That’s really useful, thanks…
Our situation is that our course team adopted notebooks back in 2014, for a course that started in 2016, if memory serves. Independently of the institution, we developed a VirtualBox VM that ran a notebook server and some other things; students run it on their own machines, started via Vagrant. (There were various reasons I wanted to use Vagrant, not least that it gave us a flexible way to push fixes via a new Vagrantfile if we needed to. We never did, but it was there as a fallback.) Course numbers: 400 or so students a year, on a module lasting nine months and requiring about four hours a week of notebook activities.
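For anyone unfamiliar with the mechanism: Vagrant reads a `Vagrantfile` (a small Ruby DSL file), so shipping students a replacement file is enough to repoint them at a patched box or change VM settings without rebuilding the image on their machines. A minimal sketch of the idea — the box name, port, and memory figure here are hypothetical, not our actual configuration:

```ruby
# Hypothetical sketch of the sort of Vagrantfile we mean; the box name,
# forwarded port, and memory setting are illustrative only.
Vagrant.configure("2") do |config|
  # A prebuilt VirtualBox box image containing the notebook server
  # (hypothetical name).
  config.vm.box = "example/notebook-vm"

  # Forward the notebook server port so students browse to localhost:8888.
  config.vm.network "forwarded_port", guest: 8888, host: 8888

  config.vm.provider "virtualbox" do |vb|
    vb.memory = 2048
  end
end
```

Pushing a fix would then just be a matter of dropping in the new file and having students run `vagrant reload`.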
We also ran a course using notebooks on a FutureLearn MOOC (Learn to Code for Data Analysis) but learners there were encouraged to use Anaconda, though we also started to mention Azure Notebooks and CoCalc.
Via a collaborator in IT, we managed to get a simple JupyterHub temporary-notebook server running on Azure Kubernetes, accessed from the Moodle VLE (we’ll be posting our deployment notes at some point). There are around 1k students on the course, but as it’s a 1–2 hour optional activity we see perhaps 10–20% adoption. We got away with this as much as anything because the institution was looking the other way at the time. Now we have a precedent, we need to keep the server running for each new presentation of the course.
As far as battles go, I think I weakly tried to open too many fronts at once: I think an internal JupyterHub server would be useful, but I also think BinderHub and Docker servers would be useful too. They each serve different needs, and can act as gateway drugs to each other. But all three technologies are alien to pretty much everyone, so it’s hard to make a case for how they complement each other.
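On the Docker side, part of the gateway appeal is how little it takes to try: using one of the community-maintained Jupyter Docker Stacks images, a local notebook server is a single command (assuming Docker is already installed):

```shell
# Pull and run a minimal Jupyter notebook server from the Jupyter Docker Stacks,
# mapping the notebook port to the host; the terminal prints a tokenised URL
# to open in a browser. --rm discards the container on exit.
docker run --rm -p 8888:8888 jupyter/minimal-notebook
```

That one-liner is often enough to move someone from "what is this?" to wanting the same thing hosted for a whole cohort — which is where the JupyterHub case comes in.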
As you mentioned, students who’ve used the tech are our best allies and I should make more use of student feedback (positive and negative).
As far as adoption goes, I mentioned our production model briefly: two years to design and present a course for the first time and then it’s pretty much supposed to go unchanged for another five, partly because the presentation materials are handed over from the module team to the VLE folk. We’ve twisted that with our VM course because we control that, and the notebooks that students use within it. This has allowed us to tweak and fettle the VM and notebook material for each course, in part in light of student comments, in part as a result of updates to the pandas / scipy stack etc. Module teams also tend to work independently, rather than working together around a particular technology other than ones forced on us (like the VLE).
This production model is one reason there is so much internal friction; but at the same time, the nature of notebooks could, I believe, be transformative to how we develop and deliver (interactive) materials to our remote students. As much as anything, we are a publishing house. I really should pay more attention to how O’Reilly work. (And I also need to do some demos around Jupyter Book.)
The research rationale for running notebook servers is another front that could be opened, and I suspect the best way to do that would be by running workshops for postgrad research students. (This is the only class of student we actually see face to face; we have a few hundred internal postgrad students on campus; our undergrad population is completely remote.)
Being a remote worker myself, my opportunity for daily face-to-face lobbying is also severely limited. If you’re on the end of a Skype call rather than in the room, it’s harder to make a point and gauge how well it’s gone down.
Re: patience… yes… I know… and: breathe… I also need to get better at drip feeding the story over a long campaign rather than: this, this, this, oh, and this, this, this… etc. (I’m pretty sure I’ve drunk too much of the big picture Kool Aid…!)