Hello community, I’m working with Jeremy Howard ( @jph00 ) on a blogging platform called fastpages that automatically converts Jupyter notebooks into blog posts and hosts them on GitHub Pages.
We created this for a variety of reasons:
We wanted something open and free, where people can own their own data.
No nagwalls / paywalls for readers
Everything happens automatically via GitHub Actions once you push your notebook to a folder in your repo.
We feel that Jupyter is a natural medium for writing data science blogs, but until now it has been hard to do this painlessly.
We went ahead and added additional features such as:
Embedded Twitter cards & YouTube videos
Interactive visualizations made with Altair remain interactive in the blog post
Optional code folding
Magic commands via comments for cell hiding
Optional automatic TOC generation
Optional link badges to GitHub and Colab
Ability to create banners such as warnings, tooltips, etc. with markdown shortcuts.
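For example, the cell-hiding flags are just comments on the first line of a notebook cell, roughly like this (flag names as documented by fastpages at the time of writing; double-check the exact spelling against the fastpages README):

```python
#hide
# In fastpages, a cell whose first line is the "#hide" comment is dropped
# entirely (code and output) from the rendered post; "#collapse-hide"
# instead keeps the cell but folds the code behind a toggle readers can
# expand. The cell below is typical setup boilerplate you might hide.
setup = {"plot_style": "dark", "figure_dpi": 150}
```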
We would love anyone who is interested in this project to try it out, and give us any suggestions on additional features we could add to make it better for users. You can see a demo blog post here. Thank you for your feedback!
I saw you insert Colab badges by default, which means I am contractually obliged to suggest that adding Binder badges would be a great feature. The SVG for the badge is http://mybinder.org/badge_logo.svg
Hi @betatim, of course! Would love to give Binder the love it deserves!
One question: since this is a blogging site, I can imagine folks having a diverse set of Jupyter notebooks on many different topics. For example, someone may choose to write about Altair one day and deep learning using fastai the next day. Therefore, I can imagine that these notebooks would have very different dependencies.
What is the best way to deal with this situation? My understanding is that Binder users are conventionally assumed to have a separate repo per project, with dependencies defined globally at the repo level. Please correct me if I’m wrong - I’m just trying to figure out the best way to set this up and provide instructions in this scenario.
Back in the day, when I used tmpnb.org to power interactive posts on my blog, I put %pip install foo (or !conda install bar) at the top of my notebooks for “one-time” packages and kept the common ones in a requirements.txt that always got installed.
I think the %pip pattern is also what people do on Colab, where they can’t control what is pre-installed.
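The reason %pip is preferred over a bare !pip is that it targets the interpreter the kernel is actually running on. A sketch of the command it effectively builds (package name is just an example):

```python
import sys

def pip_install_cmd(package: str) -> list:
    # %pip install <package> effectively runs pip as a module of the
    # kernel's own interpreter, so the package lands in the environment
    # the notebook is executing in -- not whichever "pip" is first on PATH.
    return [sys.executable, "-m", "pip", "install", package]

# e.g. a first cell for a post about Altair:
cmd = pip_install_cmd("altair")
```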
Another option is to use a (big) base environment from a different repository and pull in the notebook for the post when someone launches it. Search this forum for “binder boxes” or “prepared binder”. The idea is that someone maintains a repo that installs dependencies but has little content of its own. In a different repo (or gist) you have all the content but none of the dependencies. You combine the two with a (slightly too) complex URL (but you are auto generating it, so that should be OK) which tells BinderHub to launch the container image for the dependencies repo but put a notebook from the contents repo in it. I made a little example based on the Kaggle kernels env: https://github.com/betatim/kaggle-binder/tree/master#how-can-i-use-this
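One common way to build that “combining” URL is an nbgitpuller-style link, which mybinder.org supports: the path segment names the dependencies repo, and a percent-encoded urlpath tells the launched container to pull the content repo. A sketch (repo names below are placeholders; verify the exact URL shape against the nbgitpuller link generator before relying on it):

```python
from urllib.parse import quote

def binder_pull_url(env_repo: str, content_repo: str, branch: str = "master") -> str:
    # Launch the image built from env_repo (dependencies, little content),
    # then have nbgitpuller clone content_repo (the posts) into it at startup.
    # The git-pull query string must itself be percent-encoded so Binder
    # passes it through to the running server untouched.
    pull = quote(
        "git-pull?repo=https://github.com/{}&branch={}".format(content_repo, branch),
        safe="",
    )
    return "https://mybinder.org/v2/gh/{}/{}?urlpath={}".format(env_repo, branch, pull)

# Hypothetical example: Kaggle-like env repo + a separate posts repo.
url = binder_pull_url("betatim/kaggle-binder", "someone/blog-posts")
```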
@betatim @jph00 I wonder if a good solution might be to vendor in the Docker image that Colab uses to run its notebooks? I took a look at https://github.com/googlecolab/backend-container, but it’s not clear where the Docker image is stored or defined. Does anyone know someone on the Colab team who might be able to host an image on DockerHub for the community?
This way, the Colab and Binder environments would always be at parity, and the same !pip install commands would work identically on each.
Perhaps this already exists somewhere and I’m looking in the wrong places?
If we could put the Docker image on DockerHub or a public gcr.io repository, that would be great. Then you could do something like the Kaggle kernel binder I linked to above.
The blog generator could then use the Colab image for mybinder.org.
Sounds sweet.
Unfortunately I don’t know anyone on the colab team personally :-/