* using log directory ‘/data/gannet/ripley/R/packages/tests-devel/cvms.Rcheck’
* using R Under development (unstable) (2025-02-24 r87805)
* using platform: x86_64-pc-linux-gnu
* R was compiled by
    gcc (GCC) 14.2.1 20240912 (Red Hat 14.2.1-3)
    GNU Fortran (GCC) 14.2.1 20240912 (Red Hat 14.2.1-3)
* running under: Fedora Linux 40 (Workstation Edition)
* using session charset: UTF-8
* using option ‘--no-stop-on-test-error’
* checking for file ‘cvms/DESCRIPTION’ ... OK
* this is package ‘cvms’ version ‘1.6.3’
* package encoding: UTF-8
* checking package namespace information ... OK
* checking package dependencies ... OK
* checking if this is a source package ... OK
* checking if there is a namespace ... OK
* checking for executable files ... OK
* checking for hidden files and directories ... OK
* checking for portable file names ... OK
* checking for sufficient/correct file permissions ... OK
* checking whether package ‘cvms’ can be installed ... [28s/67s] OK
See 'https://www.r-project.org/nosvn/R.check/r-devel-linux-x86_64-fedora-gcc/cvms-00install.html' for details.
* checking package directory ... OK
* checking ‘build’ directory ... OK
* checking DESCRIPTION meta-information ... OK
* checking top-level files ... OK
* checking for left-over files ... OK
* checking index information ... OK
* checking package subdirectories ... OK
* checking code files for non-ASCII characters ... OK
* checking R files for syntax errors ... OK
* checking whether the package can be loaded ... OK
* checking whether the package can be loaded with stated dependencies ... OK
* checking whether the package can be unloaded cleanly ... OK
* checking whether the namespace can be loaded with stated dependencies ... OK
* checking whether the namespace can be unloaded cleanly ... OK
* checking loading without being on the library search path ... OK
* checking use of S3 registration ... OK
* checking dependencies in R code ... OK
* checking S3 generic/method consistency ... OK
* checking replacement functions ... OK
* checking foreign function calls ... OK
* checking R code for possible problems ... [44s/121s] OK
* checking Rd files ... OK
* checking Rd metadata ... OK
* checking Rd line widths ... OK
* checking Rd cross-references ... OK
* checking for missing documentation entries ... OK
* checking for code/documentation mismatches ... OK
* checking Rd \usage sections ... OK
* checking Rd contents ... OK
* checking for unstated dependencies in examples ... OK
* checking contents of ‘data’ directory ... OK
* checking data for non-ASCII characters ... OK
* checking LazyData ... OK
* checking data for ASCII and uncompressed saves ... OK
* checking installed files from ‘inst/doc’ ... OK
* checking files in ‘vignettes’ ... OK
* checking examples ... OK
* checking for unstated dependencies in ‘tests’ ... OK
* checking tests ... [159s/210s] ERROR
  Running ‘testthat.R’ [159s/209s]
Running the tests in ‘tests/testthat.R’ failed.
Complete output:
  > library(testthat)
  > library(cvms)
  >
  > if (require("xpectr")) {
  +   test_check("cvms")
  + }
  Loading required package: xpectr
  [ FAIL 17 | WARN 18 | SKIP 71 | PASS 3600 ]

  ══ Skipped tests (71) ═════════════════════════════════════════════════════
  • Fails in check - IMPROVE THESE TESTS (1): 'test_example_functions.R:7:3'
  • On CRAN (58): 'test_baseline.R:9:3', 'test_baseline.R:630:3', 'test_baseline.R:921:3',
    'test_baseline.R:1218:3', 'test_baseline.R:1518:3', 'test_baseline.R:3602:3',
    'test_baseline.R:3694:3', 'test_cross_validate.R:134:3', 'test_cross_validate.R:246:3',
    'test_cross_validate.R:734:3', 'test_cross_validate.R:935:3', 'test_cross_validate.R:969:3',
    'test_cross_validate.R:1019:3', 'test_cross_validate.R:1245:3', 'test_cross_validate.R:1518:3',
    'test_cross_validate.R:1859:3', 'test_cross_validate.R:2200:3', 'test_cross_validate.R:2395:3',
    'test_cross_validate.R:2511:3', 'test_cross_validate.R:3138:3', 'test_cross_validate.R:3216:3',
    'test_cross_validate_fn.R:101:3', 'test_cross_validate_fn.R:512:3', 'test_cross_validate_fn.R:752:3',
    'test_cross_validate_fn.R:969:3', 'test_cross_validate_fn.R:1056:3', 'test_cross_validate_fn.R:1308:3',
    'test_cross_validate_fn.R:1419:3', 'test_cross_validate_fn.R:1552:3', 'test_cross_validate_fn.R:1952:3',
    'test_cross_validate_fn.R:2061:3', 'test_cross_validate_fn.R:2248:3', 'test_cross_validate_fn.R:2525:3',
    'test_cross_validate_fn.R:2574:3', 'test_cross_validate_fn.R:2615:3', 'test_cross_validate_fn.R:3292:3',
    'test_cross_validate_fn.R:3357:3', 'test_cross_validate_fn.R:3425:3', 'test_cross_validate_fn.R:3485:3',
    'test_evaluate.R:3845:3', 'test_evaluate.R:4268:3', 'test_evaluate.R:4767:3', 'test_evaluate.R:5412:3',
    'test_most_challenging.R:8:3', 'test_most_challenging.R:369:3', 'test_most_challenging.R:742:3',
    'test_most_challenging.R:1502:3', 'test_select_definitions.R:7:3', 'test_select_metrics.R:7:3',
    'test_select_metrics.R:573:3', 'test_validate.R:123:3', 'test_validate.R:296:3', 'test_validate.R:430:3',
    'test_validate.R:665:3', 'test_validate_fn.R:185:3', 'test_validate_fn.R:828:3',
    'test_validate_fn.R:2195:3', 'test_validate_fn.R:3079:3'
  • Skipping check for CRAN release due to r_hub failure (1): 'test_combine_predictors.R:259:3'
  • Skipping parallel tests (3): 'test_parallelization.R:11:3', 'test_parallelization.R:56:3',
    'test_parallelization.R:141:3'
  • Skipping test as R version is > 4.2. (1): 'test_cross_validate_fn.R:469:3'
  • empty test (1): 'test_parallelization.R:99:1'
  • keras and tensorflow take too long and have too many dependencies (1): 'test_cross_validate_fn.R:2333:3'
  • mac and ubuntu give different warnings (4): 'test_cross_validate.R:596:3', 'test_cross_validate.R:664:3',
    'test_helpers.R:42:3', 'test_helpers.R:287:3'
  • tidymodels have too many dependencies (1): 'test_cross_validate_fn.R:2460:3'

  ══ Failed tests ═══════════════════════════════════════════════════════════
  ── Failure ('test_cross_validate.R:571:3'): gaussian mixed models with cross_validate() ──
  CVed$RMSE not equal to c(9.65949, 15.20226).
  2/2 mismatches (average diff: NaN)
  [1] NaN - 9.66 == NaN
  [2] NaN - 15.20 == NaN
  ── Failure ('test_cross_validate.R:572:3'): gaussian mixed models with cross_validate() ──
  CVed$MAE not equal to c(7.145933, 13.577082).
  2/2 mismatches (average diff: NaN)
  [1] NaN - 7.15 == NaN
  [2] NaN - 13.58 == NaN
  ── Failure ('test_cross_validate.R:573:3'): gaussian mixed models with cross_validate() ──
  CVed$r2m not equal to c(0.28219291, 0.01319592).
  2/2 mismatches (average diff: NaN)
  [1] NA - 0.2822 == NA
  [2] NA - 0.0132 == NA
  ── Failure ('test_cross_validate.R:574:3'): gaussian mixed models with cross_validate() ──
  CVed$r2c not equal to c(0.804314, 0.5016056).
  2/2 mismatches (average diff: NaN)
  [1] NA - 0.804 == NA
  [2] NA - 0.502 == NA
  ── Failure ('test_cross_validate.R:575:3'): gaussian mixed models with cross_validate() ──
  CVed$AIC not equal to c(175.9497, 194.6358).
  2/2 mismatches (average diff: NaN)
  [1] NA - 176 == NA
  [2] NA - 195 == NA
  ── Failure ('test_cross_validate.R:576:3'): gaussian mixed models with cross_validate() ──
  CVed$AICc not equal to c(178.2523, 196.9384).
  2/2 mismatches (average diff: NaN)
  [1] NA - 178 == NA
  [2] NA - 197 == NA
  ── Failure ('test_cross_validate.R:577:3'): gaussian mixed models with cross_validate() ──
  CVed$BIC not equal to c(180.3948, 199.0809).
  2/2 mismatches (average diff: NaN)
  [1] NA - 180 == NA
  [2] NA - 199 == NA
  ── Failure ('test_cross_validate.R:580:3'): gaussian mixed models with cross_validate() ──
  CVed$`Convergence Warnings` not equal to c(0, 0).
  2/2 mismatches (average diff: 4)
  [1] 4 - 0 == 4
  [2] 4 - 0 == 4
  ── Failure ('test_cross_validate.R:584:3'): gaussian mixed models with cross_validate() ──
  CVed$`Warnings and Messages`[[1]] not equal to structure(...).
  Attributes: < Component "row.names": Numeric: lengths (8, 0) differ >
  Component "Fold Column": Lengths (8, 0) differ (string compare on first 0)
  Component "Fold": Numeric: lengths (8, 0) differ
  Component "Function": Lengths (8, 0) differ (string compare on first 0)
  Component "Type": Lengths (8, 0) differ (string compare on first 0)
  Component "Message": Lengths (8, 0) differ (string compare on first 0)
  ── Failure ('test_validate.R:402:3'): gaussian model with validate() ───────────
  Vgauss$RMSE not equal to 7.75.
  1/1 mismatches
  [1] NaN - 7.75 == NaN
  ── Failure ('test_validate.R:403:3'): gaussian model with validate() ───────────
  Vgauss$r2m not equal to 0.305.
  Modes: logical, numeric
  target is logical, current is numeric
  ── Failure ('test_validate.R:404:3'): gaussian model with validate() ───────────
  Vgauss$r2c not equal to 0.749.
  Modes: logical, numeric
  target is logical, current is numeric
  ── Failure ('test_validate.R:405:3'): gaussian model with validate() ───────────
  Vgauss$AIC not equal to 149.
  Modes: logical, numeric
  target is logical, current is numeric
  ── Failure ('test_validate.R:406:3'): gaussian model with validate() ───────────
  Vgauss$AICc not equal to 152.
  Modes: logical, numeric
  target is logical, current is numeric
  ── Failure ('test_validate.R:407:3'): gaussian model with validate() ───────────
  Vgauss$BIC not equal to 152.5377.
  Modes: logical, numeric
  target is logical, current is numeric
  ── Failure ('test_validate.R:408:3'): gaussian model with validate() ───────────
  Vgauss$`Convergence Warnings` not equal to 0.
  1/1 mismatches
  [1] 1 - 0 == 1
  ── Failure ('test_validate.R:414:3'): gaussian model with validate() ───────────
  as.character(Vgauss$Process[[1]]) %in% ... is not TRUE
  `actual`:   FALSE
  `expected`: TRUE

  [ FAIL 17 | WARN 18 | SKIP 71 | PASS 3600 ]
  Error: Test failures
  Execution halted
* checking for unstated dependencies in vignettes ... OK
* checking package vignettes ... OK
* checking re-building of vignette outputs ... [6m/13m] OK
* checking PDF version of manual ... [15s/29s] OK
* checking HTML version of manual ... [9s/19s] OK
* checking for non-standard things in the check directory ... OK
* checking for detritus in the temp directory ... OK
* DONE

Status: 1 ERROR
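
The failing expectations all involve gaussian (mixed) model metrics (RMSE, MAE, r2m, r2c, AIC, AICc, BIC) coming back as NaN/NA together with non-zero `Convergence Warnings`, which suggests the underlying lme4 fits emitted convergence warnings on this R-devel/Fedora setup rather than producing the expected estimates. Below is a minimal sketch for reproducing the symptom outside of R CMD check; it follows cvms' documented `participant.scores` example, not the literal code of test_cross_validate.R (the formulas used in the failing tests are not shown in the log):

  # Sketch only: assumes cvms, groupdata2 and lme4 are installed and uses the
  # package's documented example, not the exact test setup.
  library(cvms)
  library(groupdata2)

  set.seed(1)
  data <- fold(
    participant.scores,
    k = 4,
    cat_col = "diagnosis",
    id_col = "participant"
  )

  cv <- cross_validate(
    data,
    formulas = "score ~ diagnosis + (1|session)",
    family = "gaussian"
  )

  # On the failing check these come back NaN/NA with a non-zero
  # convergence-warning count; on a passing setup they are finite numbers.
  cv$RMSE
  cv$`Convergence Warnings`
  cv$`Warnings and Messages`[[1]]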