Friday, April 8, 2016

How are you using Docker in your development workflow?

If you have been reading my blog posts rather than simply flicking them out of your news feed as a bit of noise, you will know that I have been working on a project which aims to make the deployment of Python web applications easier. I wrote a bit about this in the post titled 'Building a better user experience for deploying Python web applications'.

That post got a surprising number of reads, many more than I would normally expect to see in such a short period of time, so there definitely seems to be a lot of interest in the topic.

For what I am developing, I am targeting a local development workflow where you work directly on your own computer, as well as deploying directly to some host. At the same time though, I am providing a way to ease the transition to bundling up your application as a Docker image, so that it can then be run using a Docker hosting service or a more comprehensive container application platform or PaaS.

In the workflow I am creating, I am also allowing for the ability to iterate on changes to your code base while it is running inside of a Docker container. This is so that, where your local development system runs a different operating system from the one you deploy to, you can still easily debug your code for the target system.

Personally I feel that most people will still likely develop on their own local system first, rather than doing development exclusively within Docker in some way.

Although this is my view, I am very much interested in how others see Docker fitting into the development workflow when implementing Python web applications. I would therefore like to hear your feedback, so I can factor what people are actually doing, or what they want to be able to do, into the system I am creating.

Luckily my colleague, the awesome Steve Pousty, has recently put up a survey asking similar questions about the use of Docker in development.

It would help me immensely in what I am working on for Python if you could respond to Steve's survey, as I can then learn from the results as well. The survey is short and should take as little as five minutes to fill in. You can of course take longer if you want to provide additional feedback on top of the short list of multiple choice questions.

When the survey is done, Steve will be collating the results and making them available, so it should be an interesting set of data for anyone working in this space.

You can see Steve’s original blog post about the survey at:

* Input Request: How Do You Use Docker Containers For Your Local Development?

The survey itself you can find over on Survey Monkey at:

https://www.surveymonkey.com/r/dockerdev

If you fill in the survey, make sure you mark Python among the languages you are using so I know which responses may be especially relevant to me. I would also be interested to know what Python WSGI server you are using, or whether you are using an ASYNC Python web server, so please add that as extra information at the end of the survey.

In the system I am developing, I am trying to cater for all the main WSGI servers in use (gunicorn, mod_wsgi-express, uWSGI, Waitress), as well as providing ways of also running up other servers based on the ASYNC model. Knowing what servers you are using will therefore help me understand what else I should be supporting in the workflow.
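
To give a concrete idea of what those servers have in common, here is a minimal sketch of a WSGI application; the module name and response are purely illustrative and not part of the system being described. Any WSGI-compliant server, including the four listed above, can host a callable like this without changes.

    # A minimal WSGI application. The message is only an example; any
    # WSGI-compliant server such as gunicorn, mod_wsgi-express, uWSGI
    # or Waitress can host this callable unchanged.

    def application(environ, start_response):
        # Build a simple plain text response.
        body = b'Hello World!\n'
        status = '200 OK'
        headers = [
            ('Content-Type', 'text/plain'),
            ('Content-Length', str(len(body))),
        ]
        start_response(status, headers)
        return [body]

If that were saved as hello.py, you could typically point gunicorn at it with 'gunicorn hello:application', or Waitress with 'waitress-serve hello:application', with equivalent options existing for the other servers.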

Looking forward to any comments you have. Thanks.

Thursday, April 7, 2016

Learning more about using OpenShift 3.

I still have a long list of topics I could post about here on my own blog site, but over the last couple of months or so I have been having too much fun playing with the new version of OpenShift, based on Docker and Kubernetes, and understanding everything about it. The more I dig into OpenShift, the more convinced I become that it is the best platform around for deploying applications in our new containerised world.

A platform alone isn't going to give you everything you may need for the best experience of working with a particular programming language such as Python, though. This is where I am working on my own magic secret sauce to make everything even easier for us Python developers. I described a bit about what I was doing around improving the deployment experience for Python web applications in a prior blog post. I am going to start ramping up soon on writing all the documentation for the packages I have been working on, and hope to have more to show by the time of PyCon US.

In the interim, if you are interested in OpenShift and some of the things I have been looking at and uncovering, I have been posting over on the OpenShift blog site. The posts I have put up there over the past month and a bit are:

  • Using persistent volumes with docker as a Developer on OpenShift - This explains how to use persistent volumes with OpenShift. This is an area where OpenShift goes beyond what is possible with hosting environments which only support 12 factor or cloud native applications. That is, not only can you host web applications, you can also run applications such as databases, which require access to persistent file system based storage.
  • Using an Image Source to reduce build times - This post is actually an extension of a post I did here on my own site about improving Docker build times for Python applications. In it I show how one can use the sped-up build mechanism, based on a Python wheelhouse, within the OpenShift build and deployment infrastructure.
  • Using a generic webhook to trigger builds - This describes how to use generic web hooks to trigger a build and deployment within OpenShift. This is easily done when using GitHub to host your web application code, but in this case I wanted to trigger the build and deployment upon the successful completion of a test run under Travis CI, rather than as soon as code was pushed up to GitHub. This necessitated implementing a web hook proxy, and I show how that was done. A rough sketch of the idea is included after this list.
  • Working with OpenShift configurations - Finally, this post provides a cheat sheet for where to quickly find information about what different configuration objects in OpenShift are all about and what settings they provide.
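
To give a rough idea of the web hook proxy mentioned in the third post above, the following is a minimal sketch only; it is not the implementation from that post, and the OpenShift URL, secret and Travis CI payload fields are placeholder assumptions you would need to verify against your own setup.

    # Hypothetical web hook proxy: accept a Travis CI notification and,
    # only when the build passed, trigger an OpenShift generic web hook
    # to kick off a new build and deployment. The URL and field names
    # here are illustrative assumptions, not values from the post.

    import json

    import requests
    from flask import Flask, request

    app = Flask(__name__)

    # Placeholder URL for the generic web hook of an OpenShift build
    # configuration; the real URL comes from your own OpenShift project.
    OPENSHIFT_WEBHOOK_URL = (
        'https://openshift.example.com:8443/oapi/v1/namespaces/'
        'myproject/buildconfigs/myapp/webhooks/secret101/generic')

    @app.route('/travis', methods=['POST'])
    def travis_notification():
        # Travis CI notifications arrive as a form post with the details
        # in a JSON encoded 'payload' field.
        payload = json.loads(request.form['payload'])

        # Only trigger a new build when the tests passed. The exact
        # field to check depends on the Travis CI payload format.
        if payload.get('state') == 'passed':
            requests.post(OPENSHIFT_WEBHOOK_URL, verify=False)

        return 'OK'

    if __name__ == '__main__':
        app.run(host='0.0.0.0', port=8080)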

I will be posting about more OpenShift topics on the OpenShift blog in the future, so if you are at all interested in where the next generation of Platform as a Service (PaaS), or container application platforms, is headed, make sure you follow that blog.

If you are attending PyCon US this year in Portland, Oregon, you also have the opportunity to learn more about OpenShift 3. This year I will be presenting a workshop titled 'Docker, Kubernetes, and OpenShift: Python Containers for the Real World'. This is a free workshop. If you are already attending PyCon US, all you need to do is go back to your registration details on the PyCon US web site and go into the page for adding tutorials or workshops. You will find the workshop listed there and can add it. To repeat, attending the workshop will not cost you anything extra, so if you are in Portland early for PyCon US then come along.

I will be talking about what OpenShift is and how it uses Docker and Kubernetes. I will also be demonstrating the deployment of a Django based web application along with a database, most likely using the Wagtail CMS if you are a fan of that. Hope to see you there.