R package compile errors when extending image

Hi

I have been extending the Jupyter Notebook images for some years, but now installing R packages fails (image from Quay).

Usually it worked like this in a Dockerfile

RUN Rscript -e "install.packages('units')"

I’m not an R expert, but it did the job. Now I get an error while configure runs:

checking for udunits2.h... no
checking for udunits2/udunits2.h... no
configure: error: 
--------------------------------------------------------------------------------
  Configuration failed because libudunits2.so was not found. Try installing:
    * deb: libudunits2-dev (Debian, Ubuntu, ...)
    * rpm: udunits2-devel (Fedora, EPEL, ...)

The libudunits2-dev package is installed, /usr/include/udunits2.h exists.

Looking at

using C compiler: ‘x86_64-conda-linux-gnu-cc (conda-forge gcc 14.1.0-1) 14.1.0’

is it possible that this compiler does not look at the system paths for include files?

It’s not limited to the units package, others fail too:

In file included from ncdf.c:2:
/usr/include/stdio.h:28:10: fatal error: bits/libc-header-start.h: No such file or directory
   28 | #include <bits/libc-header-start.h>

And when I’m in a notebook session, the build works without problems, both from R on the console and in a notebook.

Since you’re using the conda compilers, this may relate to the default $CFLAGS and similar variables set during environment activation. In a notebook session the environment is properly ‘activated’, while a Dockerfile RUN does not activate it.
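One way to test this hypothesis (a sketch, untested against this exact tag; it assumes the conda activation hooks are wired into login shells in these images) is to force RUN commands through a login shell so the activation scripts execute first:

```dockerfile
# Make RUN commands go through a login bash, so conda's activation
# scripts (which export CC, CFLAGS, etc.) run before Rscript
SHELL ["/bin/bash", "--login", "-c"]
RUN Rscript -e "install.packages('units')"
```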

Can you share the full Dockerfile you are using, and the image in which the runtime installation does work?

Thanks for your reply, here is a reduced version throwing the udunits2 error described above:

FROM quay.io/jupyter/r-notebook:hub-5.1.0

USER root
RUN apt-get update && apt-get install -y \
  cmake \
  python3-pip \
  libarmadillo-dev \
  libudunits2-dev \
  libmagick++-dev \
  libglpk-dev \
  libssl-dev \
  libgdal-dev \
  libgeos-dev \
  libproj-dev \
  zip less vim man gpg povray xvfb && \
  apt-get clean

RUN Rscript -e "install.packages(pkgs=c( \
   'terra', \
   'tidyverse' ), repos=c('http://cran.r-project.org'), dependencies=TRUE, timeout=300)"

After more tests I am sure the compiler suite from conda used in the base image is broken. I tried setting CFLAGS and calling install.packages() with configure.args= to add the standard header and linker paths, without success.
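Roughly the kind of call I mean (a sketch; the flag names are assumed from the units package’s configure options, and the paths are the Debian defaults — it did not help here because the compiler itself could not find the system headers):

```dockerfile
# Sketch: pass explicit include/lib paths to the package's configure script
RUN Rscript -e "install.packages('units', configure.args = c('--with-udunits2-include=/usr/include', '--with-udunits2-lib=/usr/lib/x86_64-linux-gnu'))"
```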

Searching for how to change the behaviour of R package builds led me to this post: udunits2 package installation in Docker image - Super User

Now with this before the installation statement

RUN sed -i 's/x86_64-conda-linux-gnu-cc/gcc/g' /opt/conda/lib/R/etc/Makeconf && \
    sed -i 's/x86_64-conda-linux-gnu-c++/c++/g' /opt/conda/lib/R/etc/Makeconf

it works :partying_face:
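(An alternative sketch that should achieve the same without patching Makeconf: entries in ~/.R/Makevars override the corresponding Makeconf variables, and the jovyan home directory is assumed from the stock images.)

```dockerfile
# Sketch: override the conda compilers per-user instead of editing Makeconf
RUN mkdir -p /home/jovyan/.R && \
    printf 'CC=gcc\nCXX=g++\n' > /home/jovyan/.R/Makevars
```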

You haven’t activated your Conda environment, so Rscript and your compilation tools may be running in an inconsistent environment.

1 Like

Right. If you use the stock images, installing via e.g. mamba install r-terra r-tidyverse will give consistent results, with internally consistent, tested binary compatibility.
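For example (a sketch following the docker-stacks conventions; the package names are assumed to be available on conda-forge):

```dockerfile
FROM quay.io/jupyter/r-notebook:hub-5.1.0
# Binary packages from conda-forge: no local compilation, consistent toolchain
RUN mamba install --yes r-terra r-tidyverse && \
    mamba clean --all -f -y
```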

Once off the happy path of what can be easily installed from conda-forge, an image may be better off building another, R-centric environment from scratch the apt way, and registering it manually as a kernel.
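A minimal sketch of that route (untested; assumes Ubuntu’s r-base package and IRkernel from CRAN):

```dockerfile
USER root
# System R with system headers: no conda toolchain involved
RUN apt-get update && \
    apt-get install -y --no-install-recommends r-base libudunits2-dev && \
    apt-get clean
# Register the system R as an additional Jupyter kernel
RUN /usr/bin/Rscript -e "install.packages('IRkernel', repos='https://cran.r-project.org'); IRkernel::installspec(user = FALSE)"
USER ${NB_UID}
```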

2 Likes