Not really a side project, because it's still in the lab, but I've been trying to convert my lab to use as much open-source software as possible. I've been rewriting a lot of old MATLAB code in Python (a toy sketch of what that translation looks like is below), setting up IPython notebooks, and converting our disorganized set of protocols into a git-version-controlled repository of Markdown documents.

I came across this talk on a similar workflow being set up in neuroscience imaging, which got me excited:

And in the process I learned about one of the most futuristic research campuses I've seen:

I'm hoping that when I finally go to publish a paper, the analysis will be completely NumPy / IPython / Jupyter / Docker-powered. I'm currently stuck on how to distribute an eventual 50+ GB of data without paying exorbitant hosting costs...
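For a sense of what that MATLAB-to-Python rewriting looks like, here's a toy example rather than any of my actual lab code: a made-up signal, a MATLAB-style moving-average call, and its NumPy equivalent.

```python
import numpy as np

# Stand-in for a data vector that MATLAB code would have loaded from disk.
x = np.random.default_rng(0).standard_normal(1000)

# MATLAB:  y = conv(x, ones(1, 5) / 5, 'same');
# NumPy:   5-point moving average via convolution, same output length as x.
y = np.convolve(x, np.ones(5) / 5, mode="same")
```

The nice part is that snippets like this drop straight into a notebook cell, so the analysis and the write-up can live in the same Jupyter document.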