Computes a ranking of biomarkers based on effect sizes, which are estimated by Targeted Minimum Loss-Based Estimation (TMLE). This function is designed to be called inside adaptest; it should not be run on its own outside of that context.

rank_DE(Y, A, W, absolute = FALSE, negative = FALSE,
  learning_library = c("SL.glm", "SL.step", "SL.glm.interaction",
  "SL.gam"))

Arguments

Y

(numeric vector) - continuous or binary biomarker outcome variables

A

(numeric vector) - binary treatment indicator: 1 = treatment, 0 = control

W

(numeric vector, numeric matrix, or numeric data.frame) - matrix of baseline covariates, where each column corresponds to one baseline covariate and each row corresponds to one observation

absolute

(logical) - whether or not to test for the absolute effect size. If FALSE, tests for a directional effect. This argument overrides negative.

negative

(logical) - whether or not to test for a negative effect size. If FALSE, tests for a positive effect size. This is effective only when absolute = FALSE.

learning_library

(character vector) - library of learning algorithms to be used in fitting the "Q" and "g" steps of the standard TMLE procedure.
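As a sketch of how the arguments fit together, the following simulates small inputs and calls rank_DE with a reduced learning library (the data dimensions, covariate name, and choice of SL.glm/SL.mean here are illustrative assumptions, not part of this page):

```r
library(adaptest)  # assumed installed, along with SuperLearner

set.seed(1)
n <- 50                                  # observations
p <- 10                                  # biomarkers
A <- rbinom(n, 1, 0.5)                   # binary treatment indicator
W <- data.frame(age = rnorm(n, 50, 10))  # one baseline covariate
Y <- matrix(rnorm(n * p), nrow = n)      # biomarker outcomes (columns = biomarkers)

# Rank biomarkers by directional (positive) effect size:
ranks <- rank_DE(Y = Y, A = A, W = W,
                 absolute = FALSE, negative = FALSE,
                 learning_library = c("SL.glm", "SL.mean"))
```

A smaller learning_library such as this trades estimation flexibility for speed, which can be useful when ranking many biomarkers.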

Value

An integer vector containing the ranks of the biomarkers.
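Because the return value assigns each biomarker its rank, the top-ranked biomarkers can be selected with plain vector operations; a minimal base-R sketch using a small hypothetical ranks vector (not output from this page):

```r
# Suppose `ranks` is the integer vector returned by rank_DE();
# here a small hypothetical example for illustration:
ranks <- c(3L, 1L, 5L, 2L, 4L)

# Indices of the biomarkers ranked in the top 2:
top2 <- which(ranks <= 2)
top2
#> [1] 2 4
```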

Examples

set.seed(1234)
data(simpleArray)
simulated_array <- simulated_array
simulated_treatment <- simulated_treatment
rank_DE(Y = simulated_array,
        A = simulated_treatment,
        W = rep(1, length(simulated_treatment)),
        absolute = FALSE,
        negative = FALSE)
#>    [1]   6  23   3   4   2   5  10  46  11   1 600 549 744 282
#>   [15] 872 490 929 139 461 712 874 931 552  85 628 766 901 643
#> (remaining output omitted; the full vector contains ranks for all 1000 biomarkers)