README.html 68.3 KB
Newer Older
L
Luo Tao 已提交
1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34
<!DOCTYPE html>
<html class="theme theme-white">
<head>
<meta charset="utf-8">
<title>线性回归</title>
<link href="https://www.zybuluo.com/static/assets/template-theme-white.css" rel="stylesheet" media="screen">
<style type="text/css">

#wmd-preview h1  {
    color: #0077bb; /* 将标题改为蓝色 */
}</style>
</head>
<body class="theme theme-white">
<div style="visibility: hidden; overflow: hidden; position: absolute; top: 0px; height: 1px; width: auto; padding: 0px; border: 0px; margin: 0px; text-align: left; text-indent: 0px; text-transform: none; line-height: normal; letter-spacing: normal; word-spacing: normal;"><div id="MathJax_SVG_Hidden"></div><svg><defs id="MathJax_SVG_glyphs"><path id="MJMATHI-6E" stroke-width="1" d="M21 287Q22 293 24 303T36 341T56 388T89 425T135 442Q171 442 195 424T225 390T231 369Q231 367 232 367L243 378Q304 442 382 442Q436 442 469 415T503 336T465 179T427 52Q427 26 444 26Q450 26 453 27Q482 32 505 65T540 145Q542 153 560 153Q580 153 580 145Q580 144 576 130Q568 101 554 73T508 17T439 -10Q392 -10 371 17T350 73Q350 92 386 193T423 345Q423 404 379 404H374Q288 404 229 303L222 291L189 157Q156 26 151 16Q138 -11 108 -11Q95 -11 87 -5T76 7T74 17Q74 30 112 180T152 343Q153 348 153 366Q153 405 129 405Q91 405 66 305Q60 285 60 284Q58 278 41 278H27Q21 284 21 287Z"></path><path id="MJMAIN-7B" stroke-width="1" d="M434 -231Q434 -244 428 -250H410Q281 -250 230 -184Q225 -177 222 -172T217 -161T213 -148T211 -133T210 -111T209 -84T209 -47T209 0Q209 21 209 53Q208 142 204 153Q203 154 203 155Q189 191 153 211T82 231Q71 231 68 234T65 250T68 266T82 269Q116 269 152 289T203 345Q208 356 208 377T209 529V579Q209 634 215 656T244 698Q270 724 324 740Q361 748 377 749Q379 749 390 749T408 750H428Q434 744 434 732Q434 719 431 716Q429 713 415 713Q362 710 332 689T296 647Q291 634 291 499V417Q291 370 288 353T271 314Q240 271 184 255L170 250L184 245Q202 239 220 230T262 196T290 137Q291 131 291 1Q291 -134 296 -147Q306 -174 339 -192T415 -213Q429 -213 431 -216Q434 -219 434 -231Z"></path><path id="MJMATHI-79" stroke-width="1" d="M21 287Q21 301 36 335T84 406T158 442Q199 442 224 419T250 355Q248 336 247 334Q247 331 231 288T198 191T182 105Q182 62 196 45T238 27Q261 27 281 38T312 61T339 94Q339 95 344 114T358 173T377 247Q415 397 419 404Q432 431 462 431Q475 431 483 424T494 412T496 403Q496 390 447 193T391 -23Q363 -106 294 -155T156 -205Q111 -205 77 -183T43 -117Q43 -95 50 -80T69 -58T89 -48T106 -45Q150 -45 150 -87Q150 -107 138 -122T115 -142T102 -147L99 -148Q101 -153 118 -160T152 -167H160Q177 -167 186 -165Q219 -156 247 -127T290 -65T313 -9T321 21L315 17Q309 13 296 6T270 -6Q250 -11 231 -11Q185 -11 150 11T104 82Q103 89 103 113Q103 170 138 262T173 379Q173 380 173 381Q173 390 173 393T169 400T158 404H154Q131 404 112 385T82 344T65 302T57 280Q55 278 41 278H27Q21 284 21 287Z"></path><path id="MJMATHI-69" stroke-width="1" d="M184 600Q184 624 203 642T247 661Q265 661 277 649T290 619Q290 596 270 577T226 557Q211 557 198 567T184 600ZM21 287Q21 295 30 318T54 369T98 420T158 442Q197 442 223 419T250 357Q250 340 236 301T196 196T154 83Q149 61 149 51Q149 26 166 26Q175 26 185 29T208 43T235 78T260 137Q263 149 265 151T282 153Q302 153 302 143Q302 135 293 112T268 61T223 11T161 -11Q129 -11 102 10T74 74Q74 91 79 106T122 220Q160 321 166 341T173 380Q173 404 156 404H154Q124 404 99 371T61 287Q60 286 59 284T58 281T56 279T53 278T49 278T41 278H27Q21 284 21 287Z"></path><path id="MJMAIN-2C" stroke-width="1" d="M78 35T78 60T94 103T137 121Q165 121 187 96T210 8Q210 -27 201 -60T180 -117T154 -158T130 -185T117 -194Q113 -194 104 -185T95 -172Q95 -168 106 -156T131 -126T157 -76T173 -3V9L172 8Q170 7 167 6T161 3T152 1T140 0Q113 0 96 17Z"></path><path id="MJMATHI-78" stroke-width="1" d="M52 289Q59 331 106 386T222 442Q257 442 286 424T329 379Q371 442 430 442Q467 442 494 420T522 361Q522 332 508 314T481 292T458 288Q439 288 427 299T415 328Q415 374 465 391Q454 404 425 404Q412 404 406 402Q368 386 350 336Q290 115 290 78Q290 50 306 38T341 26Q378 26 414 59T463 140Q466 150 469 151T485 153H489Q504 153 504 145Q504 144 502 134Q486 77 440 33T333 -11Q263 -11 227 52Q186 -10 133 -10H127Q78 -10 57 16T35 71Q35 103 54 123T99 143Q142 143 142 101Q142 81 130 66T107 46T94 41L91 40Q91 39 97 36T113 29T132 26Q168 26 194 71Q203 87 217 139T245 247T261 313Q266 340 266 352Q266 380 251 392T217 404Q177 404 142 372T93 290Q91 281 88 280T72 278H58Q52 284 52 289Z"></path><path id="MJMAIN-31" stroke-width="1" d="M213 578L200 573Q186 568 160 563T102 556H83V602H102Q149 604 189 617T245 641T273 663Q275 666 285 666Q294 666 302 660V361L303 61Q310 54 315 52T339 48T401 46H427V0H416Q395 3 257 3Q121 3 100 0H88V46H114Q136 46 152 46T177 47T193 50T201 52T207 57T213 61V578Z"></path><path id="MJMAIN-2E" stroke-width="1" d="M78 60Q78 84 95 102T138 120Q162 120 180 104T199 61Q199 36 182 18T139 0T96 17T78 60Z"></path><path id="MJMATHI-64" stroke-width="1" d="M366 683Q367 683 438 688T511 694Q523 694 523 686Q523 679 450 384T375 83T374 68Q374 26 402 26Q411 27 422 35Q443 55 463 131Q469 151 473 152Q475 153 483 153H487H491Q506 153 506 145Q506 140 503 129Q490 79 473 48T445 8T417 -8Q409 -10 393 -10Q359 -10 336 5T306 36L300 51Q299 52 296 50Q294 48 292 46Q233 -10 172 -10Q117 -10 75 30T33 157Q33 205 53 255T101 341Q148 398 195 420T280 442Q336 442 364 400Q369 394 369 396Q370 400 396 505T424 616Q424 629 417 632T378 637H357Q351 643 351 645T353 664Q358 683 366 683ZM352 326Q329 405 277 405Q242 405 210 374T160 293Q131 214 119 129Q119 126 119 118T118 106Q118 61 136 44T179 26Q233 26 290 98L298 109L352 326Z"></path><path id="MJMAIN-7D" stroke-width="1" d="M65 731Q65 745 68 747T88 750Q171 750 216 725T279 670Q288 649 289 635T291 501Q292 362 293 357Q306 312 345 291T417 269Q428 269 431 266T434 250T431 234T417 231Q380 231 345 210T298 157Q293 143 292 121T291 -28V-79Q291 -134 285 -156T256 -198Q202 -250 89 -250Q71 -250 68 -247T65 -230Q65 -224 65 -223T66 -218T69 -214T77 -213Q91 -213 108 -210T146 -200T183 -177T207 -139Q208 -134 209 3L210 139Q223 196 280 230Q315 247 330 250Q305 257 280 270Q225 304 212 352L210 362L209 498Q208 635 207 640Q195 680 154 696T77 713Q68 713 67 716T65 731Z"></path><path id="MJMAIN-3D" stroke-width="1" d="M56 347Q56 360 70 367H707Q722 359 722 347Q722 336 708 328L390 327H72Q56 332 56 347ZM56 153Q56 168 72 173H708Q722 163 722 153Q722 140 707 133H70Q56 140 56 153Z"></path><path id="MJMAIN-2026" stroke-width="1" d="M78 60Q78 84 95 102T138 120Q162 120 180 104T199 61Q199 36 182 18T139 0T96 17T78 60ZM525 60Q525 84 542 102T585 120Q609 120 627 104T646 61Q646 36 629 18T586 0T543 17T525 60ZM972 60Q972 84 989 102T1032 120Q1056 120 1074 104T1093 61Q1093 36 1076 18T1033 0T990 17T972 60Z"></path><path id="MJMATHI-3C9" stroke-width="1" d="M495 384Q495 406 514 424T555 443Q574 443 589 425T604 364Q604 334 592 278T555 155T483 38T377 -11Q297 -11 267 66Q266 68 260 61Q201 -11 125 -11Q15 -11 15 139Q15 230 56 325T123 434Q135 441 147 436Q160 429 160 418Q160 406 140 379T94 306T62 208Q61 202 61 187Q61 124 85 100T143 76Q201 76 245 129L253 137V156Q258 297 317 297Q348 297 348 261Q348 243 338 213T318 158L308 135Q309 133 310 129T318 115T334 97T358 83T393 76Q456 76 501 148T546 274Q546 305 533 325T508 357T495 384Z"></path><path id="MJMAIN-2B" stroke-width="1" d="M56 237T56 250T70 270H369V420L370 570Q380 583 389 583Q402 583 409 568V270H707Q722 262 722 250T707 230H409V-68Q401 -82 391 -82H389H387Q375 -82 369 -68V230H70Q56 237 56 250Z"></path><path id="MJMAIN-32" stroke-width="1" d="M109 429Q82 429 66 447T50 491Q50 562 103 614T235 666Q326 666 387 610T449 465Q449 422 429 383T381 315T301 241Q265 210 201 149L142 93L218 92Q375 92 385 97Q392 99 409 186V189H449V186Q448 183 436 95T421 3V0H50V19V31Q50 38 56 46T86 81Q115 113 136 137Q145 147 170 174T204 211T233 244T261 278T284 308T305 340T320 369T333 401T340 431T343 464Q343 527 309 573T212 619Q179 619 154 602T119 569T109 550Q109 549 114 549Q132 549 151 535T170 489Q170 464 154 447T109 429Z"></path><path id="MJMATHI-62" stroke-width="1" d="M73 647Q73 657 77 670T89 683Q90 683 161 688T234 694Q246 694 246 685T212 542Q204 508 195 472T180 418L176 399Q176 396 182 402Q231 442 283 442Q345 442 383 396T422 280Q422 169 343 79T173 -11Q123 -11 82 27T40 150V159Q40 180 48 217T97 414Q147 611 147 623T109 637Q104 637 101 637H96Q86 637 83 637T76 640T73 647ZM336 325V331Q336 405 275 405Q258 405 240 397T207 376T181 352T163 330L157 322L136 236Q114 150 114 114Q114 66 138 42Q154 26 178 26Q211 26 245 58Q270 81 285 114T318 219Q336 291 336 325Z"></path><path id="MJMATHI-6A" stroke-width="1" d="M297 596Q297 627 318 644T361 661Q378 661 389 651T403 623Q403 595 384 576T340 557Q322 557 310 567T297 596ZM288 376Q288 405 262 405Q240 405 220 393T185 362T161 325T144 293L137 279Q135 278 121 278H107Q101 284 101 286T105 299Q126 348 164 391T252 441Q253 441 260 441T272 442Q296 441 316 432Q341 418 354 401T367 348V332L318 133Q267 -67 264 -75Q246 -125 194 -164T75 -204Q25 -204 7 -183T-12 -137Q-12 -110 7 -91T53 -71Q70 -71 82 -81T95 -112Q95 -148 63 -167Q69 -168 77 -168Q111 -168 139 -140T182 -74L193 -32Q204 11 219 72T251 197T278 308T289 365Q289 372 288 376Z"></path><path id="MJMATHI-59" stroke-width="1" d="M66 637Q54 637 49 637T39 638T32 641T30 647T33 664T42 682Q44 683 56 683Q104 680 165 680Q288 680 306 683H316Q322 677 322 674T320 656Q316 643 310 637H298Q242 637 242 624Q242 619 292 477T343 333L346 336Q350 340 358 349T379 373T411 410T454 461Q546 568 561 587T577 618Q577 634 545 637Q528 637 528 647Q528 649 530 661Q533 676 535 679T549 683Q551 683 578 682T657 680Q684 680 713 681T746 682Q763 682 763 673Q763 669 760 657T755 643Q753 637 734 637Q662 632 617 587Q608 578 477 424L348 273L322 169Q295 62 295 57Q295 46 363 46Q379 46 384 45T390 35Q390 33 388 23Q384 6 382 4T366 1Q361 1 324 1T232 2Q170 2 138 2T102 1Q84 1 84 9Q84 14 87 24Q88 27 89 30T90 35T91 39T93 42T96 44T101 45T107 45T116 46T129 46Q168 47 180 50T198 63Q201 68 227 171L252 274L129 623Q128 624 127 625T125 627T122 629T118 631T113 633T105 634T96 635T83 636T66 637Z"></path><path id="MJMAIN-5E" stroke-width="1" d="M112 560L249 694L257 686Q387 562 387 560L361 531Q359 532 303 581L250 627L195 580Q182 569 169 557T148 538L140 532Q138 530 125 546L112 560Z"></path><path id="MJMATHI-58" stroke-width="1" d="M42 0H40Q26 0 26 11Q26 15 29 27Q33 41 36 43T55 46Q141 49 190 98Q200 108 306 224T411 342Q302 620 297 625Q288 636 234 637H206Q200 643 200 645T202 664Q206 677 212 683H226Q260 681 347 681Q380 681 408 681T453 682T473 682Q490 682 490 671Q490 670 488 658Q484 643 481 640T465 637Q434 634 411 620L488 426L541 485Q646 598 646 610Q646 628 622 635Q617 635 609 637Q594 637 594 648Q594 650 596 664Q600 677 606 683H618Q619 683 643 683T697 681T738 680Q828 680 837 683H845Q852 676 852 672Q850 647 840 637H824Q790 636 763 628T722 611T698 593L687 584Q687 585 592 480L505 384Q505 383 536 304T601 142T638 56Q648 47 699 46Q734 46 734 37Q734 35 732 23Q728 7 725 4T711 1Q708 1 678 1T589 2Q528 2 496 2T461 1Q444 1 444 10Q444 11 446 25Q448 35 450 39T455 44T464 46T480 47T506 54Q523 62 523 64Q522 64 476 181L429 299Q241 95 236 84Q232 76 232 72Q232 53 261 47Q262 47 267 47T273 46Q276 46 277 46T280 45T283 42T284 35Q284 26 282 19Q279 6 276 4T261 1Q258 1 243 1T201 2T142 2Q64 2 42 0Z"></path><path id="MJMAIN-33" stroke-width="1" d="M127 463Q100 463 85 480T69 524Q69 579 117 622T233 665Q268 665 277 664Q351 652 390 611T430 522Q430 470 396 421T302 350L299 348Q299 347 308 345T337 336T375 315Q457 262 457 175Q457 96 395 37T238 -22Q158 -22 100 21T42 130Q42 158 60 175T105 193Q133 193 151 175T169 130Q169 119 166 110T159 94T148 82T136 74T126 70T118 67L114 66Q165 21 238 21Q293 21 321 74Q338 107 338 175V195Q338 290 274 322Q259 328 213 329L171 330L168 332Q166 335 166 348Q166 366 174 366Q202 366 232 371Q266 376 294 413T322 525V533Q322 590 287 612Q265 626 240 626Q208 626 181 615T143 592T132 580H135Q138 579 143 578T153 573T165 566T175 555T183 540T186 520Q186 498 172 481T127 463Z"></path><path id="MJMATHI-4D" stroke-width="1" d="M289 629Q289 635 232 637Q208 637 201 638T194 648Q194 649 196 659Q197 662 198 666T199 671T201 676T203 679T207 681T212 683T220 683T232 684Q238 684 262 684T307 683Q386 683 398 683T414 678Q415 674 451 396L487 117L510 154Q534 190 574 254T662 394Q837 673 839 675Q840 676 842 678T846 681L852 683H948Q965 683 988 683T1017 684Q1051 684 1051 673Q1051 668 1048 656T1045 643Q1041 637 1008 637Q968 636 957 634T939 623Q936 618 867 340T797 59Q797 55 798 54T805 50T822 48T855 46H886Q892 37 892 35Q892 19 885 5Q880 0 869 0Q864 0 828 1T736 2Q675 2 644 2T609 1Q592 1 592 11Q592 13 594 25Q598 41 602 43T625 46Q652 46 685 49Q699 52 704 61Q706 65 742 207T813 490T848 631L654 322Q458 10 453 5Q451 4 449 3Q444 0 433 0Q418 0 415 7Q413 11 374 317L335 624L267 354Q200 88 200 79Q206 46 272 46H282Q288 41 289 37T286 19Q282 3 278 1Q274 0 267 0Q265 0 255 0T221 1T157 2Q127 2 95 1T58 0Q43 0 39 2T35 11Q35 13 38 25T43 40Q45 46 65 46Q135 46 154 86Q158 92 223 354T289 629Z"></path><path id="MJMATHI-53" stroke-width="1" d="M308 24Q367 24 416 76T466 197Q466 260 414 284Q308 311 278 321T236 341Q176 383 176 462Q176 523 208 573T273 648Q302 673 343 688T407 704H418H425Q521 704 564 640Q565 640 577 653T603 682T623 704Q624 704 627 704T632 705Q645 705 645 698T617 577T585 459T569 456Q549 456 549 465Q549 471 550 475Q550 478 551 494T553 520Q553 554 544 579T526 616T501 641Q465 662 419 662Q362 662 313 616T263 510Q263 480 278 458T319 427Q323 425 389 408T456 390Q490 379 522 342T554 242Q554 216 546 186Q541 164 528 137T492 78T426 18T332 -20Q320 -22 298 -22Q199 -22 144 33L134 44L106 13Q83 -14 78 -18T65 -22Q52 -22 52 -14Q52 -11 110 221Q112 227 130 227H143Q149 221 149 216Q149 214 148 207T144 186T142 153Q144 114 160 87T203 47T255 29T308 24Z"></path><path id="MJMATHI-45" stroke-width="1" d="M492 213Q472 213 472 226Q472 230 477 250T482 285Q482 316 461 323T364 330H312Q311 328 277 192T243 52Q243 48 254 48T334 46Q428 46 458 48T518 61Q567 77 599 117T670 248Q680 270 683 272Q690 274 698 274Q718 274 718 261Q613 7 608 2Q605 0 322 0H133Q31 0 31 11Q31 13 34 25Q38 41 42 43T65 46Q92 46 125 49Q139 52 144 61Q146 66 215 342T285 622Q285 629 281 629Q273 632 228 634H197Q191 640 191 642T193 659Q197 676 203 680H757Q764 676 764 669Q764 664 751 557T737 447Q735 440 717 440H705Q698 445 698 453L701 476Q704 500 704 528Q704 558 697 578T678 609T643 625T596 632T532 634H485Q397 633 392 631Q388 629 386 622Q385 619 355 499T324 377Q347 376 372 376H398Q464 376 489 391T534 472Q538 488 540 490T557 493Q562 493 565 493T570 492T572 491T574 487T577 483L544 351Q511 218 508 216Q505 213 492 213Z"></path><path id="MJSZ2-2211" stroke-width="1" d="M60 948Q63 950 665 950H1267L1325 815Q1384 677 1388 669H1348L1341 683Q1320 724 1285 761Q1235 809 1174 838T1033 881T882 898T699 902H574H543H251L259 891Q722 258 724 252Q725 250 724 246Q721 243 460 -56L196 -356Q196 -357 407 -357Q459 -357 548 -357T676 -358Q812 -358 896 -353T1063 -332T1204 -283T1307 -196Q1328 -170 1348 -124H1388Q1388 -125 1381 -145T1356 -210T1325 -294L1267 -449L666 -450Q64 -450 61 -448Q55 -446 55 -439Q55 -437 57 -433L590 177Q590 178 557 222T452 366T322 544L56 909L55 924Q55 945 60 948Z"></path><path id="MJMAIN-28" stroke-width="1" d="M94 250Q94 319 104 381T127 488T164 576T202 643T244 695T277 729T302 750H315H319Q333 750 333 741Q333 738 316 720T275 667T226 581T184 443T167 250T184 58T225 -81T274 -167T316 -220T333 -241Q333 -250 318 -250H315H302L274 -226Q180 -141 137 -14T94 250Z"></path><path id="MJMAIN-2212" stroke-width="1" d="M84 237T84 250T98 270H679Q694 262 694 250T679 230H98Q84 237 84 250Z"></path><path id="MJMAIN-29" stroke-width="1" d="M60 749L64 750Q69 750 74 750H86L114 726Q208 641 251 514T294 250Q294 182 284 119T261 12T224 -76T186 -143T145 -194T113 -227T90 -246Q87 -249 86 -250H74Q66 -250 63 -250T58 -247T55 -238Q56 -237 66 -225Q221 -64 221 250T66 725Q56 737 55 738Q55 746 60 749Z"></path><path id="MJMAIN-38" stroke-width="1" d="M70 417T70 494T124 618T248 666Q319 666 374 624T429 515Q429 485 418 459T392 417T361 389T335 371T324 363L338 354Q352 344 366 334T382 323Q457 264 457 174Q457 95 399 37T249 -22Q159 -22 101 29T43 155Q43 263 172 335L154 348Q133 361 127 368Q70 417 70 494ZM286 386L292 390Q298 394 301 396T311 403T323 413T334 425T345 438T355 454T364 471T369 491T371 513Q371 556 342 586T275 624Q268 625 242 625Q201 625 165 599T128 534Q128 511 141 492T167 463T217 431Q224 426 228 424L286 386ZM250 21Q308 21 350 55T392 137Q392 154 387 169T375 194T353 216T330 234T301 253T274 270Q260 279 244 289T218 306L210 311Q204 311 181 294T133 239T107 157Q107 98 150 60T250 21Z"></path><path id="MJMAIN-3A" stroke-width="1" d="M78 370Q78 394 95 412T138 430Q162 430 180 414T199 371Q199 346 182 328T139 310T96 327T78 370ZM78 60Q78 84 95 102T138 120Q162 120 180 104T199 61Q199 36 182 18T139 0T96 17T78 60Z"></path></defs></svg></div><div id="wmd-preview" class="wmd-preview wmd-preview-full-reader"><div class="md-section-divider"></div><div class="md-section-divider"></div><h1 data-anchor-id="oxpz" id="线性回归">线性回归</h1><p data-anchor-id="5xhj">让我们从经典的线性回归(Linear Regression [<a href="#参考文献">1</a>])模型开始这份教程。在这一章里,你将使用真实的数据集建立起一个房价预测模型,并且了解到机器学习中的若干重要概念。</p><div class="md-section-divider"></div><h2 data-anchor-id="9ijb" id="背景介绍">背景介绍</h2><p data-anchor-id="z636">给定一个大小为<span class="MathJax_Preview"></span><span class="MathJax_SVG" id="MathJax-Element-59-Frame" role="textbox" aria-readonly="true" style="font-size: 100%; display: inline-block;"><svg xmlns:xlink="http://www.w3.org/1999/xlink" viewBox="0 -464.0516853480245 600.5 496.10337069604896" style="width: 1.39ex; height: 1.158ex; vertical-align: -0.116ex; margin: 1px 0px;"><g stroke="black" fill="black" stroke-width="0" transform="matrix(1 0 0 -1 0 0)"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-6E"></use></g></svg></span><script type="math/tex" id="MathJax-Element-59">n</script>的数据集  <span class="MathJax_Preview"></span><span class="MathJax_SVG" id="MathJax-Element-60-Frame" role="textbox" aria-readonly="true" style="font-size: 100%; display: inline-block;"><svg xmlns:xlink="http://www.w3.org/1999/xlink" viewBox="0 -811.9875111244761 8313.188488672427 1126.8798169060667" style="width: 19.344ex; height: 2.664ex; vertical-align: -0.811ex; margin: 1px 0px;"><g stroke="black" fill="black" stroke-width="0" transform="matrix(1 0 0 -1 0 0)"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-7B"></use><g transform="translate(500,0)"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-79"></use><use transform="scale(0.7071067811865476)" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-69" x="693" y="-213"></use></g><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-2C" x="1335" y="0"></use><g transform="translate(1780,0)"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-78"></use><g transform="translate(572,-150)"><use transform="scale(0.7071067811865476)" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-69"></use><use transform="scale(0.7071067811865476)" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-31" x="345" y="0"></use></g></g><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-2C" x="3051" y="0"></use><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-2E" x="3496" y="0"></use><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-2E" x="3941" y="0"></use><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-2E" x="4386" y="0"></use><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-2C" x="4831" y="0"></use><g transform="translate(5277,0)"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-78"></use><g transform="translate(572,-150)"><use transform="scale(0.7071067811865476)" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-69"></use><use transform="scale(0.7071067811865476)" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-64" x="345" y="0"></use></g></g><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-7D" x="6563" y="0"></use><use transform="scale(0.7071067811865476)" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-6E" x="9990" y="675"></use><g transform="translate(7064,-287)"><use transform="scale(0.7071067811865476)" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-69"></use><use transform="scale(0.7071067811865476)" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-3D" x="345" y="0"></use><use transform="scale(0.7071067811865476)" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-31" x="1124" y="0"></use></g></g></svg></span><script type="math/tex" id="MathJax-Element-60">{\{y_{i}, x_{i1}, ..., x_{id}\}}_{i=1}^{n}</script>,其中<span class="MathJax_Preview"></span><span class="MathJax_SVG" id="MathJax-Element-61-Frame" role="textbox" aria-readonly="true" style="font-size: 100%; display: inline-block;"><svg xmlns:xlink="http://www.w3.org/1999/xlink" viewBox="0 -463.0516853480245 4787.188129734929 679.103370696049" style="width: 11.12ex; height: 1.622ex; vertical-align: -0.579ex; margin: 1px 0px;"><g stroke="black" fill="black" stroke-width="0" transform="matrix(1 0 0 -1 0 0)"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-78"></use><g transform="translate(572,-150)"><use transform="scale(0.7071067811865476)" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-69"></use><use transform="scale(0.7071067811865476)" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-31" x="345" y="0"></use></g><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-2C" x="1270" y="0"></use><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-2026" x="1715" y="0"></use><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-2C" x="3055" y="0"></use><g transform="translate(3500,0)"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-78"></use><g transform="translate(572,-150)"><use transform="scale(0.7071067811865476)" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-69"></use><use transform="scale(0.7071067811865476)" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-64" x="345" y="0"></use></g></g></g></svg></span><script type="math/tex" id="MathJax-Element-61">x_{i1}, \ldots, x_{id}</script>是第<span class="MathJax_Preview"></span><span class="MathJax_SVG" id="MathJax-Element-62-Frame" role="textbox" aria-readonly="true" style="font-size: 100%; display: inline-block;"><svg xmlns:xlink="http://www.w3.org/1999/xlink" viewBox="0 -682.0516853480245 345.5 714.103370696049" style="width: 0.811ex; height: 1.622ex; vertical-align: -0.116ex; margin: 1px 0px;"><g stroke="black" fill="black" stroke-width="0" transform="matrix(1 0 0 -1 0 0)"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-69"></use></g></svg></span><script type="math/tex" id="MathJax-Element-62">i</script>个样本<span class="MathJax_Preview"></span><span class="MathJax_SVG" id="MathJax-Element-63-Frame" role="textbox" aria-readonly="true" style="font-size: 100%; display: inline-block;"><svg xmlns:xlink="http://www.w3.org/1999/xlink" viewBox="0 -715.0516853480245 523.5 746.103370696049" style="width: 1.158ex; height: 1.737ex; vertical-align: -0.116ex; margin: 1px 0px;"><g stroke="black" fill="black" stroke-width="0" transform="matrix(1 0 0 -1 0 0)"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-64"></use></g></svg></span><script type="math/tex" id="MathJax-Element-63">d</script>个属性上的取值,<span class="MathJax_Preview"></span><span class="MathJax_SVG" id="MathJax-Element-64-Frame" role="textbox" aria-readonly="true" style="font-size: 100%; display: inline-block;"><svg xmlns:xlink="http://www.w3.org/1999/xlink" viewBox="0 -464.0516853480245 834.8053928999522 690.103370696049" style="width: 1.969ex; height: 1.622ex; vertical-align: -0.579ex; margin: 1px 0px;"><g stroke="black" fill="black" stroke-width="0" transform="matrix(1 0 0 -1 0 0)"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-79"></use><use transform="scale(0.7071067811865476)" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-69" x="693" y="-213"></use></g></svg></span><script type="math/tex" id="MathJax-Element-64">y_i</script>是该样本待预测的目标。线性回归模型假设目标<span class="MathJax_Preview"></span><span class="MathJax_SVG" id="MathJax-Element-65-Frame" role="textbox" aria-readonly="true" style="font-size: 100%; display: inline-block;"><svg xmlns:xlink="http://www.w3.org/1999/xlink" viewBox="0 -464.0516853480245 834.8053928999522 690.103370696049" style="width: 1.969ex; height: 1.622ex; vertical-align: -0.579ex; margin: 1px 0px;"><g stroke="black" fill="black" stroke-width="0" transform="matrix(1 0 0 -1 0 0)"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-79"></use><use transform="scale(0.7071067811865476)" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-69" x="693" y="-213"></use></g></svg></span><script type="math/tex" id="MathJax-Element-65">y_i</script>可以被属性间的线性组合描述,即</p><div class="md-section-divider"></div><p data-anchor-id="76r1"><span class="MathJax_Preview"></span><div class="MathJax_SVG_Display" role="textbox" aria-readonly="true" style="text-align: center;"><span class="MathJax_SVG" id="MathJax-Element-66-Frame" style="font-size: 100%; display: inline-block;"><svg xmlns:xlink="http://www.w3.org/1999/xlink" viewBox="0 -715.0516853480245 21191.74570299315 941.103370696049" style="width: 49.228ex; height: 2.201ex; vertical-align: -0.579ex; margin: 1px 0px;"><g stroke="black" fill="black" stroke-width="0" transform="matrix(1 0 0 -1 0 0)"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-79"></use><use transform="scale(0.7071067811865476)" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-69" x="693" y="-213"></use><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-3D" x="1112" y="0"></use><g transform="translate(2168,0)"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-3C9"></use><use transform="scale(0.7071067811865476)" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-31" x="880" y="-213"></use></g><g transform="translate(3245,0)"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-78"></use><g transform="translate(572,-150)"><use transform="scale(0.7071067811865476)" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-69"></use><use transform="scale(0.7071067811865476)" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-31" x="345" y="0"></use></g></g><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-2B" x="4738" y="0"></use><g transform="translate(5738,0)"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-3C9"></use><use transform="scale(0.7071067811865476)" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-32" x="880" y="-213"></use></g><g transform="translate(6815,0)"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-78"></use><g transform="translate(572,-150)"><use transform="scale(0.7071067811865476)" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-69"></use><use transform="scale(0.7071067811865476)" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-32" x="345" y="0"></use></g></g><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-2B" x="8308" y="0"></use><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-2026" x="9308" y="0"></use><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-2B" x="10703" y="0"></use><g transform="translate(11704,0)"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-3C9"></use><use transform="scale(0.7071067811865476)" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-64" x="880" y="-213"></use></g><g transform="translate(12797,0)"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-78"></use><g transform="translate(572,-150)"><use transform="scale(0.7071067811865476)" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-69"></use><use transform="scale(0.7071067811865476)" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-64" x="345" y="0"></use></g></g><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-2B" x="14306" y="0"></use><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-62" x="15307" y="0"></use><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-2C" x="15736" y="0"></use><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-69" x="16181" y="0"></use><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-3D" x="16804" y="0"></use><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-31" x="17861" y="0"></use><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-2C" x="18361" y="0"></use><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-2026" x="18806" y="0"></use><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-2C" x="20146" y="0"></use><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-6E" x="20591" y="0"></use></g></svg></span></div><script type="math/tex; mode=display" id="MathJax-Element-66">y_i = \omega_1x_{i1} + \omega_2x_{i2} + \ldots + \omega_dx_{id} + b,  i=1,\ldots,n</script></p><p data-anchor-id="xieq">例如,在我们将要建模的房价预测问题里,<span class="MathJax_Preview"></span><span class="MathJax_SVG" id="MathJax-Element-67-Frame" role="textbox" aria-readonly="true" style="font-size: 100%; display: inline-block;"><svg xmlns:xlink="http://www.w3.org/1999/xlink" viewBox="0 -463.0516853480245 1208.486940139403 778.0602608392912" style="width: 2.78ex; height: 1.853ex; vertical-align: -0.811ex; margin: 1px 0px;"><g stroke="black" fill="black" stroke-width="0" transform="matrix(1 0 0 -1 0 0)"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-78"></use><g transform="translate(572,-150)"><use transform="scale(0.7071067811865476)" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-69"></use><use transform="scale(0.7071067811865476)" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-6A" x="345" y="0"></use></g></g></svg></span><script type="math/tex" id="MathJax-Element-67">x_{ij}</script>是描述房子<span class="MathJax_Preview"></span><span class="MathJax_SVG" id="MathJax-Element-68-Frame" role="textbox" aria-readonly="true" style="font-size: 100%; display: inline-block;"><svg xmlns:xlink="http://www.w3.org/1999/xlink" viewBox="0 -682.0516853480245 345.5 714.103370696049" style="width: 0.811ex; height: 1.622ex; vertical-align: -0.116ex; margin: 1px 0px;"><g stroke="black" fill="black" stroke-width="0" transform="matrix(1 0 0 -1 0 0)"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-69"></use></g></svg></span><script type="math/tex" id="MathJax-Element-68">i</script>的各种属性(比如房间的个数、周围学校和医院的个数、交通状况等),而 <span class="MathJax_Preview"></span><span class="MathJax_SVG" id="MathJax-Element-69-Frame" role="textbox" aria-readonly="true" style="font-size: 100%; display: inline-block;"><svg xmlns:xlink="http://www.w3.org/1999/xlink" viewBox="0 -464.0516853480245 834.8053928999522 690.103370696049" style="width: 1.969ex; height: 1.622ex; vertical-align: -0.579ex; margin: 1px 0px;"><g stroke="black" fill="black" stroke-width="0" transform="matrix(1 0 0 -1 0 0)"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-79"></use><use transform="scale(0.7071067811865476)" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-69" x="693" y="-213"></use></g></svg></span><script type="math/tex" id="MathJax-Element-69">y_i</script>是房屋的价格。</p><p data-anchor-id="xdaz">初看起来,这个假设实在过于简单了,变量间的真实关系很难是线性的。但由于线性回归模型有形式简单和易于建模分析的优点,它在实际问题中得到了大量的应用。很多经典的统计学习、机器学习书籍[<a href="#参考文献">2,3,4</a>]也选择对线性模型独立成章重点讲解。</p><div class="md-section-divider"></div><h2 data-anchor-id="a18j" id="效果展示">效果展示</h2><p data-anchor-id="f50x">我们使用从<a href="https://archive.ics.uci.edu/ml/datasets/Housing" target="_blank">UCI Housing Data Set</a>获得的波士顿房价数据集进行模型的训练和预测。下面的散点图展示了使用模型对部分房屋价格进行的预测。其中,每个点的横坐标表示同一类房屋真实价格的中位数,纵坐标表示线性回归模型根据特征预测的结果,当二者值完全相等的时候就会落在虚线上。所以模型预测得越准确,则点离虚线越近。</p><p align="center" data-anchor-id="4jfr">
    <img src="https://raw.githubusercontent.com/PaddlePaddle/book/develop/fit_a_line/image/predictions.png" width="400"><br>
    图1. 预测值 V.S. 真实值
</p><div class="md-section-divider"></div><h2 data-anchor-id="739z" id="模型概览">模型概览</h2><div class="md-section-divider"></div><h3 data-anchor-id="tzb9" id="模型定义">模型定义</h3><p data-anchor-id="0i1k">在波士顿房价数据集中,和房屋相关的值共有14个:前13个用来描述房屋相关的各种信息,即模型中的 <span class="MathJax_Preview"></span><span class="MathJax_SVG" id="MathJax-Element-70-Frame" role="textbox" aria-readonly="true" style="font-size: 100%; display: inline-block;"><svg xmlns:xlink="http://www.w3.org/1999/xlink" viewBox="0 -463.0516853480245 916.8053928999522 641.5886520702876" style="width: 2.085ex; height: 1.506ex; vertical-align: -0.463ex; margin: 1px 0px;"><g stroke="black" fill="black" stroke-width="0" transform="matrix(1 0 0 -1 0 0)"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-78"></use><use transform="scale(0.7071067811865476)" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-69" x="809" y="-213"></use></g></svg></span><script type="math/tex" id="MathJax-Element-70">x_i</script>;最后一个值为我们要预测的该类房屋价格的中位数,即模型中的 <span class="MathJax_Preview"></span><span class="MathJax_SVG" id="MathJax-Element-71-Frame" role="textbox" aria-readonly="true" style="font-size: 100%; display: inline-block;"><svg xmlns:xlink="http://www.w3.org/1999/xlink" viewBox="0 -464.0516853480245 834.8053928999522 690.103370696049" style="width: 1.969ex; height: 1.622ex; vertical-align: -0.579ex; margin: 1px 0px;"><g stroke="black" fill="black" stroke-width="0" transform="matrix(1 0 0 -1 0 0)"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-79"></use><use transform="scale(0.7071067811865476)" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-69" x="693" y="-213"></use></g></svg></span><script type="math/tex" id="MathJax-Element-71">y_i</script>。因此,我们的模型就可以表示成:</p><div class="md-section-divider"></div><p data-anchor-id="94vk"><span class="MathJax_Preview"></span><div class="MathJax_SVG_Display" role="textbox" aria-readonly="true" style="text-align: center;"><span class="MathJax_SVG" id="MathJax-Element-72-Frame" style="font-size: 100%; display: inline-block;"><svg xmlns:xlink="http://www.w3.org/1999/xlink" viewBox="0 -943.8583648847041 16375.58888520427 1130.1735062000191" style="width: 37.992ex; height: 2.664ex; vertical-align: -0.579ex; margin: 1px 0px;"><g stroke="black" fill="black" stroke-width="0" transform="matrix(1 0 0 -1 0 0)"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-59"></use><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-5E" x="249" y="228"></use><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-3D" x="1041" y="0"></use><g transform="translate(2097,0)"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-3C9"></use><use transform="scale(0.7071067811865476)" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-31" x="880" y="-213"></use></g><g transform="translate(3173,0)"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-58"></use><use transform="scale(0.7071067811865476)" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-31" x="1171" y="-213"></use></g><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-2B" x="4678" y="0"></use><g transform="translate(5679,0)"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-3C9"></use><use transform="scale(0.7071067811865476)" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-32" x="880" y="-213"></use></g><g transform="translate(6755,0)"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-58"></use><use transform="scale(0.7071067811865476)" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-32" x="1171" y="-213"></use></g><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-2B" x="8260" y="0"></use><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-2026" x="9261" y="0"></use><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-2B" x="10655" y="0"></use><g transform="translate(11656,0)"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-3C9"></use><g transform="translate(622,-150)"><use transform="scale(0.7071067811865476)" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-31"></use><use transform="scale(0.7071067811865476)" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-33" x="500" y="0"></use></g></g><g transform="translate(13086,0)"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-58"></use><g transform="translate(828,-150)"><use transform="scale(0.7071067811865476)" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-31"></use><use transform="scale(0.7071067811865476)" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-33" x="500" y="0"></use></g></g><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-2B" x="14945" y="0"></use><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-62" x="15946" y="0"></use></g></svg></span></div><script type="math/tex; mode=display" id="MathJax-Element-72">\hat{Y} = \omega_1X_{1} + \omega_2X_{2} + \ldots + \omega_{13}X_{13} + b</script></p><p data-anchor-id="ch16"><span class="MathJax_Preview"></span><span class="MathJax_SVG" id="MathJax-Element-73-Frame" role="textbox" aria-readonly="true" style="font-size: 100%; display: inline-block;"><svg xmlns:xlink="http://www.w3.org/1999/xlink" viewBox="0 -943.8583648847041 763.5 963.9100502327285" style="width: 1.737ex; height: 2.201ex; vertical-align: -0.116ex; margin: 1px 0px;"><g stroke="black" fill="black" stroke-width="0" transform="matrix(1 0 0 -1 0 0)"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-59"></use><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-5E" x="249" y="228"></use></g></svg></span><script type="math/tex" id="MathJax-Element-73">\hat{Y}</script> 表示模型的预测结果,用来和真实值<span class="MathJax_Preview"></span><span class="MathJax_SVG" id="MathJax-Element-74-Frame" role="textbox" aria-readonly="true" style="font-size: 100%; display: inline-block;"><svg xmlns:xlink="http://www.w3.org/1999/xlink" viewBox="0 -704.0516853480245 763.5 724.103370696049" style="width: 1.737ex; height: 1.737ex; vertical-align: -0.116ex; margin: 1px 0px;"><g stroke="black" fill="black" stroke-width="0" transform="matrix(1 0 0 -1 0 0)"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-59"></use></g></svg></span><script type="math/tex" id="MathJax-Element-74">Y</script>区分。模型要学习的参数即:<span class="MathJax_Preview"></span><span class="MathJax_SVG" id="MathJax-Element-75-Frame" role="textbox" aria-readonly="true" style="font-size: 100%; display: inline-block;"><svg xmlns:xlink="http://www.w3.org/1999/xlink" viewBox="0 -715.0516853480245 5610.887498618268 931.103370696049" style="width: 13.089ex; height: 2.201ex; vertical-align: -0.579ex; margin: 1px 0px;"><g stroke="black" fill="black" stroke-width="0" transform="matrix(1 0 0 -1 0 0)"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-3C9"></use><use transform="scale(0.7071067811865476)" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-31" x="880" y="-213"></use><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-2C" x="1076" y="0"></use><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-2026" x="1521" y="0"></use><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-2C" x="2860" y="0"></use><g transform="translate(3305,0)"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-3C9"></use><g transform="translate(622,-150)"><use transform="scale(0.7071067811865476)" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-31"></use><use transform="scale(0.7071067811865476)" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-33" x="500" y="0"></use></g></g><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-2C" x="4736" y="0"></use><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-62" x="5181" y="0"></use></g></svg></span><script type="math/tex" id="MathJax-Element-75">\omega_1, \ldots, \omega_{13}, b</script></p><p data-anchor-id="9qug">建立模型后,我们需要给模型一个优化目标,使得学到的参数能够让预测值<span class="MathJax_Preview"></span><span class="MathJax_SVG" id="MathJax-Element-76-Frame" role="textbox" aria-readonly="true" style="font-size: 100%; display: inline-block;"><svg xmlns:xlink="http://www.w3.org/1999/xlink" viewBox="0 -943.8583648847041 763.5 963.9100502327285" style="width: 1.737ex; height: 2.201ex; vertical-align: -0.116ex; margin: 1px 0px;"><g stroke="black" fill="black" stroke-width="0" transform="matrix(1 0 0 -1 0 0)"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-59"></use><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-5E" x="249" y="228"></use></g></svg></span><script type="math/tex" id="MathJax-Element-76">\hat{Y}</script>尽可能地接近真实值<span class="MathJax_Preview"></span><span class="MathJax_SVG" id="MathJax-Element-77-Frame" role="textbox" aria-readonly="true" style="font-size: 100%; display: inline-block;"><svg xmlns:xlink="http://www.w3.org/1999/xlink" viewBox="0 -704.0516853480245 763.5 724.103370696049" style="width: 1.737ex; height: 1.737ex; vertical-align: -0.116ex; margin: 1px 0px;"><g stroke="black" fill="black" stroke-width="0" transform="matrix(1 0 0 -1 0 0)"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-59"></use></g></svg></span><script type="math/tex" id="MathJax-Element-77">Y</script>。这里我们引入损失函数(<a href="https://en.wikipedia.org/wiki/Loss_function" target="_blank">Loss Function</a>,或Cost Function)这个概念。 输入任意一个数据样本的目标值<span class="MathJax_Preview"></span><span class="MathJax_SVG" id="MathJax-Element-78-Frame" role="textbox" aria-readonly="true" style="font-size: 100%; display: inline-block;"><svg xmlns:xlink="http://www.w3.org/1999/xlink" viewBox="0 -464.0516853480245 834.8053928999522 690.103370696049" style="width: 1.969ex; height: 1.622ex; vertical-align: -0.579ex; margin: 1px 0px;"><g stroke="black" fill="black" stroke-width="0" transform="matrix(1 0 0 -1 0 0)"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-79"></use><use transform="scale(0.7071067811865476)" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-69" x="693" y="-213"></use></g></svg></span><script type="math/tex" id="MathJax-Element-78">y_{i}</script>和模型给出的预测值<span class="MathJax_Preview"></span><span class="MathJax_SVG" id="MathJax-Element-79-Frame" role="textbox" aria-readonly="true" style="font-size: 100%; display: inline-block;"><svg xmlns:xlink="http://www.w3.org/1999/xlink" viewBox="0 -703.8583648847041 834.8053928999522 929.9100502327285" style="width: 1.969ex; height: 2.201ex; vertical-align: -0.579ex; margin: 1px 0px;"><g stroke="black" fill="black" stroke-width="0" transform="matrix(1 0 0 -1 0 0)"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-79"></use><use transform="scale(0.7071067811865476)" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-69" x="693" y="-213"></use><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-5E" x="167" y="-12"></use></g></svg></span><script type="math/tex" id="MathJax-Element-79">\hat{y_{i}}</script>,损失函数输出一个非负的实值。这个实质通常用来反映模型误差的大小。</p><p data-anchor-id="e3lc">对于线性回归模型来讲,最常见的损失函数就是均方误差(Mean Squared Error, <a href="https://en.wikipedia.org/wiki/Mean_squared_error" target="_blank">MSE</a>)了,它的形式是:</p><div class="md-section-divider"></div><p data-anchor-id="zib5"><span class="MathJax_Preview"></span><div class="MathJax_SVG_Display" role="textbox" aria-readonly="true" style="text-align: center;"><span class="MathJax_SVG" id="MathJax-Element-80-Frame" style="font-size: 100%; display: inline-block;"><svg xmlns:xlink="http://www.w3.org/1999/xlink" viewBox="0 -1585.0070961948516 10841.351063117105 2802.850285968542" style="width: 25.135ex; height: 6.486ex; vertical-align: -2.896ex; margin: 1px 0px;"><g stroke="black" fill="black" stroke-width="0" transform="matrix(1 0 0 -1 0 0)"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-4D"></use><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-53" x="1051" y="0"></use><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-45" x="1697" y="0"></use><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-3D" x="2739" y="0"></use><g transform="translate(3915,0)"><rect stroke="none" width="720" height="60" x="0" y="220"></rect><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-31" x="110" y="676"></use><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-6E" x="60" y="-686"></use></g><g transform="translate(4922,0)"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJSZ2-2211"></use><g transform="translate(147,-1090)"><use transform="scale(0.7071067811865476)" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-69"></use><use transform="scale(0.7071067811865476)" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-3D" x="345" y="0"></use><use transform="scale(0.7071067811865476)" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-31" x="1124" y="0"></use></g><use transform="scale(0.7071067811865476)" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-6E" x="721" y="1627"></use></g><g transform="translate(6533,0)"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-28"></use><g transform="translate(389,0)"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-59"></use><use transform="scale(0.7071067811865476)" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-69" x="822" y="-213"></use><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-5E" x="212" y="228"></use></g><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-2212" x="1537" y="0"></use><g transform="translate(2538,0)"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-59"></use><use transform="scale(0.7071067811865476)" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-69" x="822" y="-213"></use></g><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-29" x="3464" y="0"></use><use transform="scale(0.7071067811865476)" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-32" x="5449" y="920"></use></g></g></svg></span></div><script type="math/tex; mode=display" id="MathJax-Element-80">MSE=\frac{1}{n}\sum_{i=1}^{n}{(\hat{Y_i}-Y_i)}^2</script></p><p data-anchor-id="etto">即对于一个大小为<span class="MathJax_Preview"></span><span class="MathJax_SVG" id="MathJax-Element-81-Frame" role="textbox" aria-readonly="true" style="font-size: 100%; display: inline-block;"><svg xmlns:xlink="http://www.w3.org/1999/xlink" viewBox="0 -464.0516853480245 600.5 496.10337069604896" style="width: 1.39ex; height: 1.158ex; vertical-align: -0.116ex; margin: 1px 0px;"><g stroke="black" fill="black" stroke-width="0" transform="matrix(1 0 0 -1 0 0)"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-6E"></use></g></svg></span><script type="math/tex" id="MathJax-Element-81">n</script>的测试集,<span class="MathJax_Preview"></span><span class="MathJax_SVG" id="MathJax-Element-82-Frame" role="textbox" aria-readonly="true" style="font-size: 100%; display: inline-block;"><svg xmlns:xlink="http://www.w3.org/1999/xlink" viewBox="0 -726.0516853480245 2461.5 769.103370696049" style="width: 5.676ex; height: 1.737ex; vertical-align: -0.232ex; margin: 1px 0px;"><g stroke="black" fill="black" stroke-width="0" transform="matrix(1 0 0 -1 0 0)"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-4D"></use><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-53" x="1051" y="0"></use><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-45" x="1697" y="0"></use></g></svg></span><script type="math/tex" id="MathJax-Element-82">MSE</script><span class="MathJax_Preview"></span><span class="MathJax_SVG" id="MathJax-Element-83-Frame" role="textbox" aria-readonly="true" style="font-size: 100%; display: inline-block;"><svg xmlns:xlink="http://www.w3.org/1999/xlink" viewBox="0 -464.0516853480245 600.5 496.10337069604896" style="width: 1.39ex; height: 1.158ex; vertical-align: -0.116ex; margin: 1px 0px;"><g stroke="black" fill="black" stroke-width="0" transform="matrix(1 0 0 -1 0 0)"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-6E"></use></g></svg></span><script type="math/tex" id="MathJax-Element-83">n</script>个数据预测结果误差平方的均值。</p><div class="md-section-divider"></div><h3 data-anchor-id="m9rr" id="训练过程">训练过程</h3><p data-anchor-id="hj08">定义好模型结构之后,我们要通过以下几个步骤进行模型训练 <br>
 1. 初始化参数,其中包括权重<span class="MathJax_Preview"></span><span class="MathJax_SVG" id="MathJax-Element-84-Frame" role="textbox" aria-readonly="true" style="font-size: 100%; display: inline-block;"><svg xmlns:xlink="http://www.w3.org/1999/xlink" viewBox="0 -464.0516853480245 966.8053928999522 642.5886520702876" style="width: 2.201ex; height: 1.506ex; vertical-align: -0.463ex; margin: 1px 0px;"><g stroke="black" fill="black" stroke-width="0" transform="matrix(1 0 0 -1 0 0)"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-3C9"></use><use transform="scale(0.7071067811865476)" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-69" x="880" y="-213"></use></g></svg></span><script type="math/tex" id="MathJax-Element-84">\omega_i</script>和偏置<span class="MathJax_Preview"></span><span class="MathJax_SVG" id="MathJax-Element-85-Frame" role="textbox" aria-readonly="true" style="font-size: 100%; display: inline-block;"><svg xmlns:xlink="http://www.w3.org/1999/xlink" viewBox="0 -715.0516853480245 429.5 747.103370696049" style="width: 1.042ex; height: 1.737ex; vertical-align: -0.116ex; margin: 1px 0px;"><g stroke="black" fill="black" stroke-width="0" transform="matrix(1 0 0 -1 0 0)"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMATHI-62"></use></g></svg></span><script type="math/tex" id="MathJax-Element-85">b</script>,对其进行初始化(如0均值,1方差)。 <br>
 2. 网络正向传播计算网络输出和损失函数。 <br>
 3. 根据损失函数进行反向误差传播 (<a href="https://en.wikipedia.org/wiki/Backpropagation" target="_blank">backpropagation</a>),将网络误差从输出层依次向前传递, 并更新网络中的参数。 <br>
 4. 重复2~3步骤,直至网络训练误差达到规定的程度或训练轮次达到设定值。</p><div class="md-section-divider"></div><h2 data-anchor-id="wozw" id="数据准备">数据准备</h2><p data-anchor-id="ytgk">执行以下命令来准备数据:</p><div class="md-section-divider"></div><pre class="prettyprint linenums prettyprinted" data-anchor-id="jh2a"><ol class="linenums"><li class="L0"><code class="language-bash"><span class="pln">cd data </span><span class="pun">&amp;&amp;</span><span class="pln"> python prepare_data</span><span class="pun">.</span><span class="pln">py</span></code></li></ol></pre><div class="md-section-divider"></div><div class="md-section-divider"></div><div class="md-section-divider"></div><p align="center" data-anchor-id="b8vw">
    <img src="https://raw.githubusercontent.com/PaddlePaddle/book/develop/fit_a_line/image/ranges.png" width="550"><br>
    图2. 各维属性的取值范围
</p><div class="md-section-divider"></div><h4 data-anchor-id="ye0u" id="整理训练集与测试集">整理训练集与测试集</h4><p data-anchor-id="m87z">我们将数据集分割为两份:一份用于调整模型的参数,即进行模型的训练,模型在这份数据集上的误差被称为<strong>训练误差</strong>;另外一份被用来测试,模型在这份数据集上的误差被称为<strong>测试误差</strong>。我们训练模型的目的是为了通过从训练数据中找到规律来预测未知的新数据,所以测试误差是更能反映模型表现的指标。分割数据的比例要考虑到两个因素:更多的训练数据会降低参数估计的方差,从而得到更可信的模型;而更多的测试数据会降低测试误差的方差,从而得到更可信的测试误差。一种常见的分割比例为<span class="MathJax_Preview"></span><span class="MathJax_SVG" id="MathJax-Element-86-Frame" role="textbox" aria-readonly="true" style="font-size: 100%; display: inline-block;"><svg xmlns:xlink="http://www.w3.org/1999/xlink" viewBox="0 -687.0516853480245 1835.0555555555557 730.103370696049" style="width: 4.286ex; height: 1.737ex; vertical-align: -0.232ex; margin: 1px 0px;"><g stroke="black" fill="black" stroke-width="0" transform="matrix(1 0 0 -1 0 0)"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-38"></use><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-3A" x="778" y="0"></use><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#MJMAIN-32" x="1334" y="0"></use></g></svg></span><script type="math/tex" id="MathJax-Element-86">8:2</script>,感兴趣的读者朋友们也可以尝试不同的设置来观察这两种误差的变化。</p><p data-anchor-id="gqtf">执行如下命令可以分割数据集,并将训练集和测试集的地址分别写入train.list 和 test.list两个文件中,供PaddlePaddle读取。</p><div class="md-section-divider"></div><pre class="prettyprint linenums prettyprinted" data-anchor-id="g3k3"><ol class="linenums"><li class="L0"><code class="language-python"><span class="pln">python prepare_data</span><span class="pun">.</span><span class="pln">py </span><span class="pun">-</span><span class="pln">r </span><span class="lit">0.8</span><span class="pln"> </span><span class="com">#默认使用8:2的比例进行分割</span></code></li></ol></pre><p data-anchor-id="4qk8">在更复杂的模型训练过程中,我们往往还会多使用一种数据集:验证集。因为复杂的模型中常常还有一些超参数(<a href="https://en.wikipedia.org/wiki/Hyperparameter_optimization" target="_blank">Hyperparameter</a>)需要调节,所以我们会尝试多种超参数的组合来分别训练多个模型,然后对比它们在验证集上的表现选择相对最好的一组超参数,最后才使用这组参数下训练的模型在测试集上评估测试误差。由于本章训练的模型比较简单,我们暂且忽略掉这个过程。</p><div class="md-section-divider"></div><h3 data-anchor-id="qt61" id="提供数据给paddlepaddle">提供数据给PaddlePaddle</h3><p data-anchor-id="s85w">准备好数据之后,我们使用一个Python data provider来为PaddlePaddle的训练过程提供数据。一个 data provider 就是一个Python函数,它会被PaddlePaddle的训练过程调用。在这个例子里,只需要读取已经保存好的数据,然后一行一行地返回给PaddlePaddle的训练进程即可。</p><div class="md-section-divider"></div><pre class="prettyprint linenums prettyprinted" data-anchor-id="yknv"><ol class="linenums"><li class="L0"><code class="language-python"><span class="kwd">from</span><span class="pln"> paddle</span><span class="pun">.</span><span class="pln">trainer</span><span class="pun">.</span><span class="typ">PyDataProvider2</span><span class="pln"> </span><span class="kwd">import</span><span class="pln"> </span><span class="pun">*</span></code></li><li class="L1"><code class="language-python"><span class="kwd">import</span><span class="pln"> numpy </span><span class="kwd">as</span><span class="pln"> np</span></code></li><li class="L2"><code class="language-python"><span class="com">#定义数据的类型和维度</span></code></li><li class="L3"><code class="language-python"><span class="lit">@provider</span><span class="pun">(</span><span class="pln">input_types</span><span class="pun">=[</span><span class="pln">dense_vector</span><span class="pun">(</span><span class="lit">13</span><span class="pun">),</span><span class="pln"> dense_vector</span><span class="pun">(</span><span class="lit">1</span><span class="pun">)])</span></code></li><li class="L4"><code class="language-python"><span class="kwd">def</span><span class="pln"> process</span><span class="pun">(</span><span class="pln">settings</span><span class="pun">,</span><span class="pln"> input_file</span><span class="pun">):</span></code></li><li class="L5"><code class="language-python"><span class="pln">    data </span><span class="pun">=</span><span class="pln"> np</span><span class="pun">.</span><span class="pln">load</span><span class="pun">(</span><span class="pln">input_file</span><span class="pun">.</span><span class="pln">strip</span><span class="pun">())</span></code></li><li class="L6"><code class="language-python"><span class="pln">    </span><span class="kwd">for</span><span class="pln"> row </span><span class="kwd">in</span><span class="pln"> data</span><span class="pun">:</span></code></li><li class="L7"><code class="language-python"><span class="pln">        </span><span class="kwd">yield</span><span class="pln"> row</span><span class="pun">[:-</span><span class="lit">1</span><span class="pun">].</span><span class="pln">tolist</span><span class="pun">(),</span><span class="pln"> row</span><span class="pun">[-</span><span class="lit">1</span><span class="pun">:].</span><span class="pln">tolist</span><span class="pun">()</span></code></li></ol></pre><div class="md-section-divider"></div><h2 data-anchor-id="9xee" id="模型配置说明">模型配置说明</h2><div class="md-section-divider"></div><h3 data-anchor-id="jcsa" id="数据定义">数据定义</h3><p data-anchor-id="rq5t">首先,通过 <code>define_py_data_sources2</code> 来配置PaddlePaddle从上面的<code>dataprovider.py</code>里读入训练数据和测试数据。 PaddlePaddle接受从命令行读入的配置信息,例如这里我们传入一个名为<code>is_predict</code>的变量来控制模型在训练和测试时的不同结构。</p><div class="md-section-divider"></div><pre class="prettyprint linenums prettyprinted" data-anchor-id="v6zd"><ol class="linenums"><li class="L0"><code class="language-python"><span class="kwd">from</span><span class="pln"> paddle</span><span class="pun">.</span><span class="pln">trainer_config_helpers </span><span class="kwd">import</span><span class="pln"> </span><span class="pun">*</span></code></li><li class="L1"><code class="language-python"></code></li><li class="L2"><code class="language-python"><span class="pln">is_predict </span><span class="pun">=</span><span class="pln"> get_config_arg</span><span class="pun">(</span><span class="str">'is_predict'</span><span class="pun">,</span><span class="pln"> bool</span><span class="pun">,</span><span class="pln"> </span><span class="kwd">False</span><span class="pun">)</span></code></li><li class="L3"><code class="language-python"></code></li><li class="L4"><code class="language-python"><span class="pln">define_py_data_sources2</span><span class="pun">(</span></code></li><li class="L5"><code class="language-python"><span class="pln">    train_list</span><span class="pun">=</span><span class="str">'data/train.list'</span><span class="pun">,</span></code></li><li class="L6"><code class="language-python"><span class="pln">    test_list</span><span class="pun">=</span><span class="str">'data/test.list'</span><span class="pun">,</span></code></li><li class="L7"><code class="language-python"><span class="pln">    module</span><span class="pun">=</span><span class="str">'dataprovider'</span><span class="pun">,</span></code></li><li class="L8"><code class="language-python"><span class="pln">    obj</span><span class="pun">=</span><span class="str">'process'</span><span class="pun">)</span></code></li></ol></pre><div class="md-section-divider"></div><h3 data-anchor-id="0tzy" id="算法配置">算法配置</h3><p data-anchor-id="0n6j">接着,指定模型优化算法的细节。由于线性回归模型比较简单,我们只要设置基本的<code>batch_size</code>即可,它指定每次更新参数的时候使用多少条数据计算梯度信息。</p><div class="md-section-divider"></div><pre class="prettyprint linenums prettyprinted" data-anchor-id="klfv"><ol class="linenums"><li class="L0"><code class="language-python"><span class="pln">settings</span><span class="pun">(</span><span class="pln">batch_size</span><span class="pun">=</span><span class="lit">2</span><span class="pun">)</span></code></li></ol></pre><div class="md-section-divider"></div><h3 data-anchor-id="omgp" id="网络结构">网络结构</h3><p data-anchor-id="8a14">最后,使用<code>fc_layer</code><code>LinearActivation</code>来表示线性回归的模型本身。</p><div class="md-section-divider"></div><pre class="prettyprint linenums prettyprinted" data-anchor-id="v0zt"><ol class="linenums"><li class="L0"><code class="language-python"><span class="com">#输入数据,13维的房屋信息</span></code></li><li class="L1"><code class="language-python"><span class="pln">x </span><span class="pun">=</span><span class="pln"> data_layer</span><span class="pun">(</span><span class="pln">name</span><span class="pun">=</span><span class="str">'x'</span><span class="pun">,</span><span class="pln"> size</span><span class="pun">=</span><span class="lit">13</span><span class="pun">)</span></code></li><li class="L2"><code class="language-python"></code></li><li class="L3"><code class="language-python"><span class="pln">y_predict </span><span class="pun">=</span><span class="pln"> fc_layer</span><span class="pun">(</span></code></li><li class="L4"><code class="language-python"><span class="pln">    input</span><span class="pun">=</span><span class="pln">x</span><span class="pun">,</span></code></li><li class="L5"><code class="language-python"><span class="pln">    param_attr</span><span class="pun">=</span><span class="typ">ParamAttr</span><span class="pun">(</span><span class="pln">name</span><span class="pun">=</span><span class="str">'w'</span><span class="pun">),</span></code></li><li class="L6"><code class="language-python"><span class="pln">    size</span><span class="pun">=</span><span class="lit">1</span><span class="pun">,</span></code></li><li class="L7"><code class="language-python"><span class="pln">    act</span><span class="pun">=</span><span class="typ">LinearActivation</span><span class="pun">(),</span></code></li><li class="L8"><code class="language-python"><span class="pln">    bias_attr</span><span class="pun">=</span><span class="typ">ParamAttr</span><span class="pun">(</span><span class="pln">name</span><span class="pun">=</span><span class="str">'b'</span><span class="pun">))</span></code></li><li class="L9"><code class="language-python"></code></li><li class="L0"><code class="language-python"><span class="kwd">if</span><span class="pln"> </span><span class="kwd">not</span><span class="pln"> is_predict</span><span class="pun">:</span><span class="pln"> </span><span class="com">#训练时,我们使用MSE,即regression_cost作为损失函数</span></code></li><li class="L1"><code class="language-python"><span class="pln">    y </span><span class="pun">=</span><span class="pln"> data_layer</span><span class="pun">(</span><span class="pln">name</span><span class="pun">=</span><span class="str">'y'</span><span class="pun">,</span><span class="pln"> size</span><span class="pun">=</span><span class="lit">1</span><span class="pun">)</span></code></li><li class="L2"><code class="language-python"><span class="pln">    cost </span><span class="pun">=</span><span class="pln"> regression_cost</span><span class="pun">(</span><span class="pln">input</span><span class="pun">=</span><span class="pln">y_predict</span><span class="pun">,</span><span class="pln"> label</span><span class="pun">=</span><span class="pln">y</span><span class="pun">)</span></code></li><li class="L3"><code class="language-python"><span class="pln">    outputs</span><span class="pun">(</span><span class="pln">cost</span><span class="pun">)</span><span class="pln"> </span><span class="com">#训练时输出MSE来监控损失的变化</span></code></li><li class="L4"><code class="language-python"><span class="kwd">else</span><span class="pun">:</span><span class="pln"> </span><span class="com">#测试时,输出预测值</span></code></li><li class="L5"><code class="language-python"><span class="pln">    outputs</span><span class="pun">(</span><span class="pln">y_predict</span><span class="pun">)</span></code></li></ol></pre><div class="md-section-divider"></div><h2 data-anchor-id="ak2p" id="训练模型">训练模型</h2><p data-anchor-id="e18b">在对应代码的根目录下执行PaddlePaddle的命令行训练程序。这里指定模型配置文件为<code>trainer_config.py</code>,训练30轮,结果保存在<code>output</code>路径下。</p><div class="md-section-divider"></div><pre class="prettyprint linenums prettyprinted" data-anchor-id="k7nw"><ol class="linenums"><li class="L0"><code class="language-bash"><span class="pun">./</span><span class="pln">train</span><span class="pun">.</span><span class="pln">sh</span></code></li></ol></pre><div class="md-section-divider"></div><h2 data-anchor-id="tjol" id="应用模型">应用模型</h2><p data-anchor-id="z7d8">现在来看下如何使用已经训练好的模型进行预测。</p><div class="md-section-divider"></div><pre class="prettyprint linenums prettyprinted" data-anchor-id="ew76"><ol class="linenums"><li class="L0"><code class="language-bash"><span class="pln">python predict</span><span class="pun">.</span><span class="pln">py</span></code></li></ol></pre><p data-anchor-id="1bsn">这里默认使用<code>output/pass-00029</code>中保存的模型进行预测,并将数据中的房价与预测结果进行对比,结果保存在 <code>predictions.png</code>中。 <br>
如果你想使用别的模型或者其它的数据进行预测,只要传入新的路径即可:</p><div class="md-section-divider"></div><pre class="prettyprint linenums prettyprinted" data-anchor-id="9q38"><ol class="linenums"><li class="L0"><code class="language-bash"><span class="pln">python predict</span><span class="pun">.</span><span class="pln">py </span><span class="pun">-</span><span class="pln">m output</span><span class="pun">/</span><span class="pln">pass</span><span class="pun">-</span><span class="lit">00020</span><span class="pln"> </span><span class="pun">-</span><span class="pln">t data</span><span class="pun">/</span><span class="pln">housing</span><span class="pun">.</span><span class="pln">test</span><span class="pun">.</span><span class="pln">npy</span></code></li></ol></pre><div class="md-section-divider"></div><h2 data-anchor-id="p7c3" id="总结">总结</h2><p data-anchor-id="3chd">在这章里,我们借助波士顿房价这一数据集,介绍了线性回归模型的基本概念,以及如何使用PaddlePaddle实现训练和测试的过程。很多的模型和技巧都是从简单的线性回归模型演化而来,因此弄清楚线性模型的原理和局限非常重要。</p><div class="md-section-divider"></div><h2 data-anchor-id="r5es" id="参考文献">参考文献</h2><ol data-anchor-id="2umn">
<li><a href="https://en.wikipedia.org/wiki/Linear_regression" target="_blank">https://en.wikipedia.org/wiki/Linear_regression</a></li>
<li>Friedman J, Hastie T, Tibshirani R. The elements of statistical learning[M]. Springer, Berlin: Springer series in statistics, 2001.</li>
<li>Murphy K P. Machine learning: a probabilistic perspective[M]. MIT press, 2012.</li>
<li>Bishop C M. Pattern recognition[J]. Machine Learning, 2006, 128.</li>
</ol><p data-anchor-id="x9ai"><br> <br>
<img src="https://i.creativecommons.org/l/by-nc-sa/4.0/88x31.png" alt="知识共享许可协议"> <br>
本教程由<a href="http://book.paddlepaddle.org" target="_blank">PaddlePaddle</a>创作,采用<a href="http://creativecommons.org/licenses/by-nc-sa/4.0/" target="_blank">知识共享 署名-非商业性使用-相同方式共享 4.0 国际 许可协议</a>进行许可。</p></div>
</body>
</html>