library(AppliedPredictiveModeling)
library(caret) # nearZeroVar, createDataPartition, train, preProcess, postResample
library(knitr) # kable
data(permeability)
kable(head(permeability))
| permeability |
|---|
| 12.520 |
| 1.120 |
| 19.405 |
| 1.730 |
| 1.680 |
| 0.510 |
kable(head(fingerprints))
(Table output omitted: `fingerprints` is a 165 × 1107 matrix of binary molecular fingerprint predictors, columns X1–X1107, each taking the value 0 or 1.)
The majority of predictors clearly have near-zero variance and are removed, taking the predictor count from 1,107 down to 388, which we will use for modeling.
nzv <- fingerprints |>
nearZeroVar()
dim(fingerprints)
## [1] 165 1107
filtered_var <- fingerprints[, -nzv] |>
as.data.frame()
dim(filtered_var)
## [1] 165 388
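For reference, `nearZeroVar()` flags predictors using two metrics: the frequency ratio of the most common value to the second most common, and the percentage of unique values. A minimal sketch for inspecting those metrics (not part of the original output):
# saveMetrics = TRUE returns freqRatio, percentUnique, zeroVar, and nzv columns
nzv_metrics <- nearZeroVar(fingerprints, saveMetrics = TRUE)
head(nzv_metrics[nzv_metrics$nzv, ]) # predictors flagged as near-zero variance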
# Splitting the data into test and training sets
set.seed(8675309)
train_index <- createDataPartition(permeability, p = 0.8, list = FALSE)
finger_train <- filtered_var[ train_index,]
finger_test <- filtered_var[-train_index,]
perm_train <- permeability[train_index,]
perm_test <- permeability[-train_index,]
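As a quick sanity check (not shown in the original output), the split sizes can be confirmed directly:
nrow(finger_train) # roughly 80% of the 165 compounds
nrow(finger_test)
length(perm_train) # should match nrow(finger_train)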
Since the predictors are already dummy variables and those with near-zero variance have been removed, the only other reasonable preprocessing step would be to remove highly correlated predictors. Because PLS actually handles correlated predictors well, however, we will leave them all in and simply train the model. Doing so gives an optimal number of components of 8, with \(R^2 = 0.5404078\).
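Had we chosen to filter correlated predictors instead, caret's `findCorrelation()` covers that step; a minimal sketch, where the 0.90 cutoff is an arbitrary illustrative choice rather than a value from this analysis:
# Hypothetical correlation filter (not applied here, since PLS tolerates
# correlated predictors)
high_corr <- findCorrelation(cor(filtered_var), cutoff = 0.90)
decorr_var <- filtered_var[, -high_corr]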
set.seed(8675309)
fitControl <- trainControl(method = "repeatedcv", number = 10, repeats = 10)
pls_fit <- train(finger_train, perm_train, method = "pls", trControl = fitControl, tuneLength = 15)
pls_fit
## Partial Least Squares
##
## 133 samples
## 388 predictors
##
## No pre-processing
## Resampling: Cross-Validated (10 fold, repeated 10 times)
## Summary of sample sizes: 120, 120, 120, 120, 119, 120, ...
## Resampling results across tuning parameters:
##
## ncomp RMSE Rsquared MAE
## 1 13.45032 0.3068522 10.264160
## 2 11.72986 0.4658254 8.493874
## 3 11.42642 0.4949485 8.700251
## 4 11.31142 0.5177391 8.839199
## 5 11.18169 0.5342541 8.542068
## 6 11.23515 0.5321200 8.715810
## 7 11.14735 0.5378507 8.685742
## 8 11.07303 0.5404078 8.655189
## 9 11.16096 0.5310192 8.568932
## 10 11.32340 0.5253912 8.719123
## 11 11.45074 0.5182107 8.753112
## 12 11.49753 0.5170925 8.739608
## 13 11.57492 0.5135417 8.830539
## 14 11.69269 0.5107952 8.942786
## 15 11.84706 0.5038379 9.045266
##
## RMSE was used to select the optimal model using the smallest value.
## The final value used for the model was ncomp = 8.
plot(pls_fit)
The test set estimate of \(R^2\) is 0.4811436.
set.seed(8675309)
prediction <- predict(pls_fit, finger_test)
postResample(prediction,perm_test)
## RMSE Rsquared MAE
## 11.7652606 0.4811436 8.4306632
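For reference, the `Rsquared` reported by `postResample()` is the squared correlation between predictions and observations, so the test-set estimate above can be reproduced by hand:
# Equivalent manual computation of the test-set R^2
cor(prediction, perm_test)^2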
We will now fit PCR and elastic net (enet) models and compare them to the PLS model above.
The PCR model selects 25 components, significantly more than the PLS model's 8. Its test set estimate of \(R^2\) is 0.4382963, worse than the PLS model; the RMSE and MAE are also both worse.
# Tune length was chosen by fitting various lengths and picking one near the optimal to generate a meaningful plot
set.seed(8675309)
pcr_fit <- train(finger_train, perm_train, method = "pcr", trControl = fitControl, tuneLength = 30)
pcr_fit
## Principal Component Analysis
##
## 133 samples
## 388 predictors
##
## No pre-processing
## Resampling: Cross-Validated (10 fold, repeated 10 times)
## Summary of sample sizes: 120, 120, 120, 120, 119, 120, ...
## Resampling results across tuning parameters:
##
## ncomp RMSE Rsquared MAE
## 1 14.94452 0.1302255 11.489097
## 2 14.93895 0.1295192 11.609228
## 3 14.43777 0.2325498 11.039941
## 4 12.29320 0.4171461 9.047848
## 5 12.30537 0.4156420 9.036770
## 6 12.37013 0.4097093 9.086530
## 7 12.22224 0.4185155 8.971397
## 8 12.24337 0.4192296 9.058046
## 9 12.23218 0.4223463 8.989764
## 10 11.73856 0.4651432 8.863846
## 11 11.71158 0.4693896 8.848716
## 12 11.69125 0.4719666 8.860811
## 13 11.70943 0.4719551 8.792600
## 14 11.79427 0.4647456 8.940193
## 15 11.87543 0.4596853 9.032746
## 16 11.93606 0.4549557 9.055786
## 17 11.97185 0.4586595 9.127666
## 18 11.63625 0.4889297 9.037611
## 19 11.56673 0.4948309 8.925449
## 20 11.36033 0.5094339 8.709840
## 21 11.23324 0.5244596 8.602733
## 22 11.28168 0.5209965 8.646537
## 23 11.27130 0.5210863 8.674522
## 24 11.22181 0.5249124 8.606969
## 25 11.06278 0.5384437 8.533718
## 26 11.12259 0.5355603 8.623845
## 27 11.22046 0.5257694 8.726324
## 28 11.25410 0.5251071 8.730189
## 29 11.36132 0.5175185 8.770012
## 30 11.37886 0.5156507 8.794975
##
## RMSE was used to select the optimal model using the smallest value.
## The final value used for the model was ncomp = 25.
plot(pcr_fit)
set.seed(8675309)
pcr_prediction <- predict(pcr_fit, finger_test)
postResample(pcr_prediction,perm_test)
## RMSE Rsquared MAE
## 12.1840088 0.4382963 8.6010111
The enet model's test set estimate of \(R^2\) is 0.4179957, worse than both the PCR and PLS models. The resampled training estimate was much higher than the test result, suggesting potential overfitting.
set.seed(8675309)
enetGrid <- expand.grid(.lambda = c(0, 0.01, .1), .fraction = seq(.05, 1, length = 20))
enet_fit <- train(finger_train, perm_train, method = "enet", tuneGrid = enetGrid, trControl = fitControl)
enet_fit
## Elasticnet
##
## 133 samples
## 388 predictors
##
## No pre-processing
## Resampling: Cross-Validated (10 fold, repeated 10 times)
## Summary of sample sizes: 120, 120, 120, 120, 119, 120, ...
## Resampling results across tuning parameters:
##
## lambda fraction RMSE Rsquared MAE
## 0.00 0.05 1.116719e+20 0.2557061 5.030176e+19
## 0.00 0.10 2.233421e+20 0.2774669 1.006040e+20
## 0.00 0.15 3.350124e+20 0.2901107 1.509063e+20
## 0.00 0.20 4.466826e+20 0.2939524 2.012085e+20
## 0.00 0.25 5.583529e+20 0.2918357 2.515108e+20
## 0.00 0.30 6.700232e+20 0.2865417 3.018130e+20
## 0.00 0.35 7.816935e+20 0.2809954 3.521153e+20
## 0.00 0.40 8.933638e+20 0.2775050 4.024176e+20
## 0.00 0.45 1.005034e+21 0.2729495 4.527198e+20
## 0.00 0.50 1.116704e+21 0.2699196 5.030221e+20
## 0.00 0.55 1.228375e+21 0.2665327 5.533243e+20
## 0.00 0.60 1.340045e+21 0.2621094 6.036266e+20
## 0.00 0.65 1.451715e+21 0.2577248 6.539288e+20
## 0.00 0.70 1.563385e+21 0.2545939 7.042311e+20
## 0.00 0.75 1.675056e+21 0.2513110 7.545334e+20
## 0.00 0.80 1.786726e+21 0.2477257 8.048356e+20
## 0.00 0.85 1.898396e+21 0.2443250 8.551379e+20
## 0.00 0.90 2.010067e+21 0.2414842 9.054401e+20
## 0.00 0.95 2.121737e+21 0.2387279 9.557424e+20
## 0.00 1.00 2.233407e+21 0.2361269 1.006045e+21
## 0.01 0.05 1.767460e+01 0.4354336 1.174087e+01
## 0.01 0.10 2.308645e+01 0.4467463 1.477776e+01
## 0.01 0.15 2.838789e+01 0.4785230 1.780175e+01
## 0.01 0.20 3.427563e+01 0.5024158 2.089285e+01
## 0.01 0.25 3.933668e+01 0.5145348 2.368867e+01
## 0.01 0.30 4.443684e+01 0.5165564 2.680902e+01
## 0.01 0.35 4.973114e+01 0.5123344 3.008295e+01
## 0.01 0.40 5.500654e+01 0.5032206 3.334968e+01
## 0.01 0.45 6.030008e+01 0.4931362 3.660409e+01
## 0.01 0.50 6.555967e+01 0.4851627 3.983482e+01
## 0.01 0.55 7.079500e+01 0.4783941 4.304351e+01
## 0.01 0.60 7.599681e+01 0.4731252 4.623085e+01
## 0.01 0.65 8.113107e+01 0.4672680 4.941653e+01
## 0.01 0.70 8.624234e+01 0.4622470 5.262382e+01
## 0.01 0.75 9.141281e+01 0.4565942 5.586806e+01
## 0.01 0.80 9.663973e+01 0.4498057 5.914188e+01
## 0.01 0.85 1.018511e+02 0.4431560 6.242125e+01
## 0.01 0.90 1.069965e+02 0.4374275 6.566316e+01
## 0.01 0.95 1.121145e+02 0.4332081 6.887954e+01
## 0.01 1.00 1.172171e+02 0.4296871 7.207962e+01
## 0.10 0.05 1.241433e+01 0.4593154 9.408323e+00
## 0.10 0.10 1.202731e+01 0.4503048 8.513856e+00
## 0.10 0.15 1.195183e+01 0.4569286 8.472374e+00
## 0.10 0.20 1.170856e+01 0.4755882 8.442923e+00
## 0.10 0.25 1.141508e+01 0.4974156 8.336944e+00
## 0.10 0.30 1.116922e+01 0.5160029 8.227671e+00
## 0.10 0.35 1.099723e+01 0.5291567 8.133583e+00
## 0.10 0.40 1.093222e+01 0.5351416 8.089523e+00
## 0.10 0.45 1.089591e+01 0.5397881 8.071444e+00
## 0.10 0.50 1.086865e+01 0.5442330 8.071962e+00
## 0.10 0.55 1.084794e+01 0.5485689 8.075296e+00
## 0.10 0.60 1.086086e+01 0.5510264 8.101143e+00
## 0.10 0.65 1.091406e+01 0.5508553 8.149136e+00
## 0.10 0.70 1.099229e+01 0.5490070 8.223541e+00
## 0.10 0.75 1.107765e+01 0.5467707 8.307213e+00
## 0.10 0.80 1.116311e+01 0.5446891 8.393053e+00
## 0.10 0.85 1.124640e+01 0.5427049 8.474880e+00
## 0.10 0.90 1.132407e+01 0.5408338 8.543844e+00
## 0.10 0.95 1.140106e+01 0.5387098 8.607216e+00
## 0.10 1.00 1.148380e+01 0.5361322 8.673480e+00
##
## RMSE was used to select the optimal model using the smallest value.
## The final values used for the model were fraction = 0.55 and lambda = 0.1.
plot(enet_fit)
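Because the elastic net shrinks some coefficients exactly to zero, it is informative to count how many predictors the selected model actually retains. A hedged sketch using the underlying elasticnet object (not part of the original output):
# Coefficients at the selected tuning values (fraction = 0.55; lambda is fixed
# at 0.1 in the fitted object); nonzero entries are the retained predictors
enet_coefs <- predict(enet_fit$finalModel, s = 0.55, mode = "fraction",
                      type = "coefficients")$coefficients
sum(enet_coefs != 0)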
set.seed(8675309)
enet_prediction <- predict(enet_fit, finger_test)
postResample(enet_prediction,perm_test)
## RMSE Rsquared MAE
## 13.0070860 0.4179957 9.4941477
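Because the same seed and `fitControl` were used for all three fits, the cross-validation folds should be identical, so caret's `resamples()` can compare the resampling distributions side by side; a minimal sketch under that assumption:
# Compare cross-validated performance of the three permeability models
model_comp <- resamples(list(PLS = pls_fit, PCR = pcr_fit, ENET = enet_fit))
summary(model_comp)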
Ultimately this is a bit of an odd question: in a medical research context, a model would never “replace” a laboratory test of such a critical component of a drug's performance. Experimental results would be required by any regulatory body regardless of how confident any model is. In the spirit of the question, no, I would not recommend using any of these models to completely replace the experimental process, as none has a high enough \(R^2\) to provide the confidence required in a medical setting.
data(ChemicalManufacturingProcess)
kable(head(ChemicalManufacturingProcess))
| Yield | BiologicalMaterial01 | BiologicalMaterial02 | BiologicalMaterial03 | BiologicalMaterial04 | BiologicalMaterial05 | BiologicalMaterial06 | BiologicalMaterial07 | BiologicalMaterial08 | BiologicalMaterial09 | BiologicalMaterial10 | BiologicalMaterial11 | BiologicalMaterial12 | ManufacturingProcess01 | ManufacturingProcess02 | ManufacturingProcess03 | ManufacturingProcess04 | ManufacturingProcess05 | ManufacturingProcess06 | ManufacturingProcess07 | ManufacturingProcess08 | ManufacturingProcess09 | ManufacturingProcess10 | ManufacturingProcess11 | ManufacturingProcess12 | ManufacturingProcess13 | ManufacturingProcess14 | ManufacturingProcess15 | ManufacturingProcess16 | ManufacturingProcess17 | ManufacturingProcess18 | ManufacturingProcess19 | ManufacturingProcess20 | ManufacturingProcess21 | ManufacturingProcess22 | ManufacturingProcess23 | ManufacturingProcess24 | ManufacturingProcess25 | ManufacturingProcess26 | ManufacturingProcess27 | ManufacturingProcess28 | ManufacturingProcess29 | ManufacturingProcess30 | ManufacturingProcess31 | ManufacturingProcess32 | ManufacturingProcess33 | ManufacturingProcess34 | ManufacturingProcess35 | ManufacturingProcess36 | ManufacturingProcess37 | ManufacturingProcess38 | ManufacturingProcess39 | ManufacturingProcess40 | ManufacturingProcess41 | ManufacturingProcess42 | ManufacturingProcess43 | ManufacturingProcess44 | ManufacturingProcess45 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 38.00 | 6.25 | 49.58 | 56.97 | 12.74 | 19.51 | 43.73 | 100 | 16.66 | 11.44 | 3.46 | 138.09 | 18.83 | NA | NA | NA | NA | NA | NA | NA | NA | 43.00 | NA | NA | NA | 35.5 | 4898 | 6108 | 4682 | 35.5 | 4865 | 6049 | 4665 | 0.0 | NA | NA | NA | 4873 | 6074 | 4685 | 10.7 | 21.0 | 9.9 | 69.1 | 156 | 66 | 2.4 | 486 | 0.019 | 0.5 | 3 | 7.2 | NA | NA | 11.6 | 3.0 | 1.8 | 2.4 |
| 42.44 | 8.01 | 60.97 | 67.48 | 14.65 | 19.36 | 53.14 | 100 | 19.04 | 12.55 | 3.46 | 153.67 | 21.05 | 0.0 | 0 | NA | 917 | 1032.2 | 210.0 | 177 | 178 | 46.57 | NA | NA | 0 | 34.0 | 4869 | 6095 | 4617 | 34.0 | 4867 | 6097 | 4621 | 0.0 | 3 | 0 | 3 | 4869 | 6107 | 4630 | 11.2 | 21.4 | 9.9 | 68.7 | 169 | 66 | 2.6 | 508 | 0.019 | 2.0 | 2 | 7.2 | 0.1 | 0.15 | 11.1 | 0.9 | 1.9 | 2.2 |
| 42.03 | 8.01 | 60.97 | 67.48 | 14.65 | 19.36 | 53.14 | 100 | 19.04 | 12.55 | 3.46 | 153.67 | 21.05 | 0.0 | 0 | NA | 912 | 1003.6 | 207.1 | 178 | 178 | 45.07 | NA | NA | 0 | 34.8 | 4878 | 6087 | 4617 | 34.8 | 4877 | 6078 | 4621 | 0.0 | 4 | 1 | 4 | 4897 | 6116 | 4637 | 11.1 | 21.3 | 9.4 | 69.3 | 173 | 66 | 2.6 | 509 | 0.018 | 0.7 | 2 | 7.2 | 0.0 | 0.00 | 12.0 | 1.0 | 1.8 | 2.3 |
| 41.42 | 8.01 | 60.97 | 67.48 | 14.65 | 19.36 | 53.14 | 100 | 19.04 | 12.55 | 3.46 | 153.67 | 21.05 | 0.0 | 0 | NA | 911 | 1014.6 | 213.3 | 177 | 177 | 44.92 | NA | NA | 0 | 34.8 | 4897 | 6102 | 4635 | 34.8 | 4872 | 6073 | 4611 | 0.0 | 5 | 2 | 5 | 4892 | 6111 | 4630 | 11.1 | 21.3 | 9.4 | 69.3 | 171 | 68 | 2.5 | 496 | 0.018 | 1.2 | 2 | 7.2 | 0.0 | 0.00 | 10.6 | 1.1 | 1.8 | 2.1 |
| 42.49 | 7.47 | 63.33 | 72.25 | 14.02 | 17.91 | 54.66 | 100 | 18.22 | 12.80 | 3.05 | 147.61 | 21.05 | 10.7 | 0 | NA | 918 | 1027.5 | 205.7 | 178 | 178 | 44.96 | NA | NA | 0 | 34.6 | 4992 | 6233 | 4733 | 33.9 | 4886 | 6102 | 4659 | -0.7 | 8 | 4 | 18 | 4930 | 6151 | 4684 | 11.3 | 21.6 | 9.0 | 69.4 | 171 | 70 | 2.5 | 468 | 0.017 | 0.2 | 2 | 7.3 | 0.0 | 0.00 | 11.0 | 1.1 | 1.7 | 2.1 |
| 43.57 | 6.12 | 58.36 | 65.31 | 15.17 | 21.79 | 51.23 | 100 | 18.30 | 12.13 | 3.78 | 151.88 | 20.76 | 12.0 | 0 | NA | 924 | 1016.8 | 208.9 | 178 | 178 | 45.32 | NA | NA | 0 | 34.0 | 4985 | 6222 | 4786 | 33.4 | 4862 | 6115 | 4696 | -0.6 | 9 | 1 | 1 | 4871 | 6128 | 4687 | 11.4 | 21.7 | 10.1 | 68.2 | 173 | 70 | 2.5 | 490 | 0.018 | 0.4 | 2 | 7.2 | 0.0 | 0.00 | 11.5 | 2.2 | 1.8 | 2.0 |
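For orientation (a check not shown in the original output), the data set contains 176 manufacturing runs, with `Yield` plus 57 predictors:
dim(ChemicalManufacturingProcess) # 176 rows, 58 columns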
set.seed(8675309)
kable(colSums(is.na(ChemicalManufacturingProcess)))
| Variable | Missing values |
|---|---|
| Yield | 0 |
| BiologicalMaterial01 | 0 |
| BiologicalMaterial02 | 0 |
| BiologicalMaterial03 | 0 |
| BiologicalMaterial04 | 0 |
| BiologicalMaterial05 | 0 |
| BiologicalMaterial06 | 0 |
| BiologicalMaterial07 | 0 |
| BiologicalMaterial08 | 0 |
| BiologicalMaterial09 | 0 |
| BiologicalMaterial10 | 0 |
| BiologicalMaterial11 | 0 |
| BiologicalMaterial12 | 0 |
| ManufacturingProcess01 | 1 |
| ManufacturingProcess02 | 3 |
| ManufacturingProcess03 | 15 |
| ManufacturingProcess04 | 1 |
| ManufacturingProcess05 | 1 |
| ManufacturingProcess06 | 2 |
| ManufacturingProcess07 | 1 |
| ManufacturingProcess08 | 1 |
| ManufacturingProcess09 | 0 |
| ManufacturingProcess10 | 9 |
| ManufacturingProcess11 | 10 |
| ManufacturingProcess12 | 1 |
| ManufacturingProcess13 | 0 |
| ManufacturingProcess14 | 1 |
| ManufacturingProcess15 | 0 |
| ManufacturingProcess16 | 0 |
| ManufacturingProcess17 | 0 |
| ManufacturingProcess18 | 0 |
| ManufacturingProcess19 | 0 |
| ManufacturingProcess20 | 0 |
| ManufacturingProcess21 | 0 |
| ManufacturingProcess22 | 1 |
| ManufacturingProcess23 | 1 |
| ManufacturingProcess24 | 1 |
| ManufacturingProcess25 | 5 |
| ManufacturingProcess26 | 5 |
| ManufacturingProcess27 | 5 |
| ManufacturingProcess28 | 5 |
| ManufacturingProcess29 | 5 |
| ManufacturingProcess30 | 5 |
| ManufacturingProcess31 | 5 |
| ManufacturingProcess32 | 0 |
| ManufacturingProcess33 | 5 |
| ManufacturingProcess34 | 5 |
| ManufacturingProcess35 | 5 |
| ManufacturingProcess36 | 5 |
| ManufacturingProcess37 | 0 |
| ManufacturingProcess38 | 0 |
| ManufacturingProcess39 | 0 |
| ManufacturingProcess40 | 1 |
| ManufacturingProcess41 | 1 |
| ManufacturingProcess42 | 0 |
| ManufacturingProcess43 | 0 |
| ManufacturingProcess44 | 0 |
| ManufacturingProcess45 | 0 |
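Missingness is modest and concentrated in the manufacturing predictors, which motivates imputation rather than dropping rows; the total number of missing cells can be counted directly (output not shown in the original):
sum(is.na(ChemicalManufacturingProcess))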
# Combine kNN imputation with BoxCox, centering, and scaling so the kNN
# distances are computed on comparable scales, improving the imputed values
impute_model <- preProcess(ChemicalManufacturingProcess,
method = c("knnImpute", "BoxCox", "center", "scale"),
k = 5)
imputed_chemicals <- predict(impute_model, ChemicalManufacturingProcess)
# Validate that all values were actually imputed
sum(is.na(imputed_chemicals))
## [1] 0
After testing the three model types below, we choose the enet model, as it has both the lowest error values and the highest \(R^2\) of the three. Note that `Yield` was centered and scaled during preprocessing, so the error values here are on the standardized scale.
The chosen PLS model has 4 components, with \(RMSE = 0.7692694\), \(R^2 = 0.5441347\), and \(MAE = 0.5859477\).
set.seed(8675309)
train_chem_index <- createDataPartition(imputed_chemicals$Yield, p = 0.8, list = FALSE)
train_chem <- imputed_chemicals[train_chem_index,]
test_chem <- imputed_chemicals[-train_chem_index,]
pls_chem_fit <- train(Yield ~ ., data = train_chem,
method = "pls",
# Use R^2 due to importance of "correct" prediction
metric = "Rsquared",
trControl = fitControl,
# Long tune length
tuneLength = 25,
# Pre-processing removes near-zero-variance predictors, applies BoxCox, then centers and scales
preProcess = c("nzv", "BoxCox", "center", "scale"))
pls_chem_fit
## Partial Least Squares
##
## 144 samples
## 57 predictor
##
## Pre-processing: centered (56), scaled (56), remove (1)
## Resampling: Cross-Validated (10 fold, repeated 10 times)
## Summary of sample sizes: 130, 129, 129, 130, 130, 130, ...
## Resampling results across tuning parameters:
##
## ncomp RMSE Rsquared MAE
## 1 0.8573333 0.3977896 0.6662061
## 2 1.1372277 0.4556140 0.6989686
## 3 0.7949692 0.5388893 0.5919724
## 4 0.7692694 0.5441347 0.5859477
## 5 0.8644005 0.5268057 0.6187248
## 6 0.9054987 0.5194737 0.6308294
## 7 0.9442143 0.5085414 0.6437042
## 8 1.0201581 0.4982856 0.6692334
## 9 1.0364433 0.5045110 0.6759548
## 10 1.1014477 0.4994673 0.6951501
## 11 1.0852462 0.5019302 0.6922901
## 12 1.0746635 0.4996695 0.6875667
## 13 1.0969968 0.4983273 0.6954353
## 14 1.1123629 0.4983298 0.7008356
## 15 1.1322295 0.4945087 0.7087188
## 16 1.1317068 0.4972306 0.7056862
## 17 1.1371062 0.5052760 0.7022985
## 18 1.1553352 0.5046847 0.7076951
## 19 1.1708520 0.5044084 0.7129810
## 20 1.1844577 0.5034885 0.7179581
## 21 1.1803110 0.5056003 0.7178463
## 22 1.1773084 0.5077992 0.7190763
## 23 1.1863732 0.5108239 0.7229453
## 24 1.2138754 0.5050523 0.7347758
## 25 1.2555496 0.4984418 0.7516846
##
## Rsquared was used to select the optimal model using the largest value.
## The final value used for the model was ncomp = 4.
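To see at a glance why ncomp = 4 wins, the resampling profile can be plotted straight from the train object; a small sketch using caret's plot method, not part of the original output.
# Resampled R^2 across the 25 candidate component counts
plot(pls_chem_fit, metric = "Rsquared")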
The simple linear regression model has \(RMSE = 0.8806894\), \(R^2 = 0.4931804\), and \(MAE = 0.626732\).
set.seed(8675309)
lm_chem_fit <- train(Yield ~ ., data = train_chem,
method = "lm",
# Select on R^2 because getting predictions "correct" matters more than minimizing raw error
metric = "Rsquared",
trControl = fitControl,
# tuneLength has no effect here: lm has no tuning parameters
tuneLength = 25,
# Drop near-zero-variance and highly correlated predictors, then apply PCA; caret's pca step automatically centers and scales
preProcess = c("nzv", "BoxCox", "corr", "pca"))
lm_chem_fit
## Linear Regression
##
## 144 samples
## 57 predictor
##
## Pre-processing: principal component signal extraction (47), centered
## (47), scaled (47), remove (10)
## Resampling: Cross-Validated (10 fold, repeated 10 times)
## Summary of sample sizes: 130, 131, 131, 129, 128, 129, ...
## Resampling results:
##
## RMSE Rsquared MAE
## 0.8806894 0.4931804 0.626732
##
## Tuning parameter 'intercept' was held constant at a value of TRUE
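As a quick sanity check on the preprocessing, the preProcess object stored on the fit reports which transformations ran and how many principal components were retained; a minimal sketch, not part of the original output.
# Summarize the nzv/BoxCox/corr/pca steps applied before fitting
lm_chem_fit$preProcess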
The enet model has \(RMSE = 0.6287157\), \(R^2 = 0.6181578\), and \(MAE = 0.5158498\).
set.seed(8675309)
enet_chem_fit <- train(Yield ~ ., data = train_chem,
method = "enet",
metric = "Rsquared",
tuneGrid = enetGrid,
trControl = fitControl,
preProcess = c("nzv", "BoxCox", "center", "scale"))
enet_chem_fit
## Elasticnet
##
## 144 samples
## 57 predictor
##
## Pre-processing: centered (56), scaled (56), remove (1)
## Resampling: Cross-Validated (10 fold, repeated 10 times)
## Summary of sample sizes: 130, 131, 131, 129, 128, 129, ...
## Resampling results across tuning parameters:
##
## lambda fraction RMSE Rsquared MAE
## 0.00 0.05 0.6457233 0.6060526 0.5383964
## 0.00 0.10 0.6287157 0.6181578 0.5158498
## 0.00 0.15 0.7246710 0.5805725 0.5507908
## 0.00 0.20 0.8236483 0.5430566 0.5879648
## 0.00 0.25 0.8713617 0.5242737 0.6125632
## 0.00 0.30 0.9681727 0.5029801 0.6478553
## 0.00 0.35 1.1305777 0.4825170 0.6974133
## 0.00 0.40 1.3060018 0.4675320 0.7493626
## 0.00 0.45 1.3732652 0.4697420 0.7696846
## 0.00 0.50 1.3497526 0.4653710 0.7678625
## 0.00 0.55 1.3894900 0.4626363 0.7819198
## 0.00 0.60 1.4739459 0.4539308 0.8082388
## 0.00 0.65 1.5391327 0.4490732 0.8280274
## 0.00 0.70 1.6164889 0.4414355 0.8526458
## 0.00 0.75 1.7016605 0.4343163 0.8788640
## 0.00 0.80 1.8044810 0.4281988 0.9100284
## 0.00 0.85 1.9086161 0.4234427 0.9407048
## 0.00 0.90 2.0663401 0.4178373 0.9855105
## 0.00 0.95 2.2316088 0.4112049 1.0334736
## 0.00 1.00 2.3785361 0.4047149 1.0759211
## 0.01 0.05 0.8114331 0.5729199 0.6757470
## 0.01 0.10 0.6908807 0.5884238 0.5799243
## 0.01 0.15 0.6497307 0.5999176 0.5389058
## 0.01 0.20 0.6575376 0.6015881 0.5299374
## 0.01 0.25 0.6855387 0.5983259 0.5352745
## 0.01 0.30 0.7043299 0.5936536 0.5399373
## 0.01 0.35 0.7185005 0.5902423 0.5456497
## 0.01 0.40 0.7281978 0.5909329 0.5483731
## 0.01 0.45 0.7568600 0.5818483 0.5585849
## 0.01 0.50 0.8177450 0.5666609 0.5794095
## 0.01 0.55 0.8975352 0.5469604 0.6071643
## 0.01 0.60 0.9606482 0.5310927 0.6302360
## 0.01 0.65 0.9818359 0.5198197 0.6412172
## 0.01 0.70 0.9901780 0.5126611 0.6468967
## 0.01 0.75 1.0052616 0.5074887 0.6533031
## 0.01 0.80 1.0230088 0.5028807 0.6604215
## 0.01 0.85 1.0421402 0.4994898 0.6670716
## 0.01 0.90 1.0569229 0.4976886 0.6718463
## 0.01 0.95 1.0693272 0.4966092 0.6757423
## 0.01 1.00 1.0796882 0.4963302 0.6794762
## 0.10 0.05 0.8929507 0.5384323 0.7377006
## 0.10 0.10 0.8050374 0.5674056 0.6699016
## 0.10 0.15 0.7361582 0.5757647 0.6152932
## 0.10 0.20 0.6881420 0.5852827 0.5763080
## 0.10 0.25 0.6617566 0.5956991 0.5497992
## 0.10 0.30 0.6508904 0.6018471 0.5343438
## 0.10 0.35 0.6487124 0.6077807 0.5251055
## 0.10 0.40 0.6452835 0.6142528 0.5189745
## 0.10 0.45 0.6440157 0.6161252 0.5162116
## 0.10 0.50 0.6471553 0.6156165 0.5165954
## 0.10 0.55 0.6573023 0.6127256 0.5199684
## 0.10 0.60 0.6813831 0.6069570 0.5283939
## 0.10 0.65 0.7199557 0.5993755 0.5416153
## 0.10 0.70 0.7479348 0.5928979 0.5541474
## 0.10 0.75 0.7695601 0.5846747 0.5657768
## 0.10 0.80 0.7907428 0.5709231 0.5790088
## 0.10 0.85 0.8016701 0.5616477 0.5873800
## 0.10 0.90 0.8147626 0.5536322 0.5950316
## 0.10 0.95 0.8294434 0.5463611 0.6024185
## 0.10 1.00 0.8470722 0.5396539 0.6096558
##
## Rsquared was used to select the optimal model using the largest value.
## The final values used for the model were fraction = 0.1 and lambda = 0.
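All three fits share the same fitControl (10-fold CV repeated 10 times), so their resampling summaries can be lined up with caret's resamples(); a sketch, with the caveat that fold memberships are only identical where the seeds align, and the list names are illustrative.
# Side-by-side resampling summaries for the three fits above
model_comp <- resamples(list(PLS = pls_chem_fit, LM = lm_chem_fit, ENET = enet_chem_fit))
summary(model_comp)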
The test-set \(R^2\) (0.6250653) is extremely close to the cross-validated training estimate (0.6181578), an encouraging sign that the model generalizes rather than overfits.
set.seed(8675309)
chem_predict <- predict(enet_chem_fit, test_chem)
postResample(chem_predict, test_chem$Yield)
## RMSE Rsquared MAE
## 0.6291565 0.6250653 0.4861769
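A predicted-versus-observed plot makes the same point visually; a minimal sketch, noting that Yield here is on the centered and scaled imputation scale.
# Points should hug the dashed identity line if predictions track reality
plot(test_chem$Yield, chem_predict,
xlab = "Observed Yield (standardized)",
ylab = "Predicted Yield (standardized)")
abline(0, 1, lty = 2)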
Of the 20 most important predictors, 13 are manufacturing predictors; within the top ten, 7 are manufacturing. This is a promising sign for anyone looking to optimize the manufacturing process.
varImp(enet_chem_fit)
## loess r-squared variable importance
##
## only 20 most important variables shown (out of 57)
##
## Overall
## ManufacturingProcess32 100.00
## ManufacturingProcess09 93.79
## BiologicalMaterial06 90.22
## ManufacturingProcess13 87.57
## ManufacturingProcess17 87.45
## ManufacturingProcess36 85.75
## BiologicalMaterial03 85.58
## ManufacturingProcess06 79.27
## ManufacturingProcess11 75.05
## BiologicalMaterial12 72.82
## ManufacturingProcess31 65.77
## BiologicalMaterial02 62.86
## BiologicalMaterial11 53.68
## ManufacturingProcess29 50.92
## BiologicalMaterial04 49.63
## ManufacturingProcess18 48.73
## ManufacturingProcess25 46.19
## ManufacturingProcess33 46.00
## BiologicalMaterial08 44.69
## ManufacturingProcess30 44.61
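The same ranking is easier to scan as a dotplot; a small sketch using caret's plot method for varImp objects.
# Top 20 predictors by loess R^2 importance
plot(varImp(enet_chem_fit), top = 20)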
Looking at the non-zero coefficients, most of the top predictors are positive, which indicates that increasing those predictor values should benefit yield. A few of the top predictors have negative coefficients, indicating their values should be decreased to improve yield. The majority of the non-zero coefficients belong to manufacturing predictors, suggesting the company has substantial control over the yields it produces.
# Extract the final enet coefficients at the selected fraction (s = 0.1);
# elasticnet has no tidy coef() accessor, so predict() on the final model is the most direct route
enet_coef <- predict(enet_chem_fit$finalModel, type = "coefficients", mode = "fraction", s = 0.1)$coefficients
# enframe() turns the named coefficient vector straight into a two-column tibble
nz_coef <- enframe(enet_coef, name = "predictor", value = "coefficient") |>
filter(coefficient != 0)
nz_coef |>
arrange(desc(coefficient)) |>
kable()
| predictor | coefficient |
|---|---|
| ManufacturingProcess32 | 0.4416650 |
| ManufacturingProcess09 | 0.2856304 |
| ManufacturingProcess04 | 0.1365651 |
| ManufacturingProcess34 | 0.1286089 |
| BiologicalMaterial06 | 0.1237811 |
| ManufacturingProcess39 | 0.0800793 |
| ManufacturingProcess06 | 0.0706503 |
| ManufacturingProcess11 | 0.0611360 |
| ManufacturingProcess15 | 0.0526565 |
| ManufacturingProcess45 | 0.0415707 |
| ManufacturingProcess43 | 0.0375007 |
| BiologicalMaterial05 | 0.0343207 |
| ManufacturingProcess30 | 0.0168951 |
| ManufacturingProcess03 | -0.0065480 |
| ManufacturingProcess35 | -0.0091956 |
| ManufacturingProcess08 | -0.0416582 |
| ManufacturingProcess13 | -0.0573615 |
| ManufacturingProcess36 | -0.0689907 |
| ManufacturingProcess28 | -0.0747085 |
| ManufacturingProcess37 | -0.0958963 |
| ManufacturingProcess17 | -0.1199865 |
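As a follow-up on the largest positive coefficient, ManufacturingProcess32 can be plotted against Yield directly; a sketch assuming ggplot2 is loaded alongside the other tidyverse packages used above.
# The enet model implies a positive slope here
ggplot(imputed_chemicals, aes(x = ManufacturingProcess32, y = Yield)) +
geom_point() +
geom_smooth(method = "lm", se = FALSE)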