Deploying a Python Flask app in Kubernetes — Beginner’s guide

In this post, we will build a simple ML app using Flask, build and push an image of the app to DockerHub, and then deploy it in Kubernetes. I have tried to keep the post easy to follow as well as informative. By the end, we will also be able to check the pod and service running in K8s.

Dependencies & Installations

  • Flask
  • Docker
  • Dockerhub account
  • Kubernetes CLI
  • MiniKube

Five steps to deploy a Python application in K8s

  1. Containerize your flask App
  2. Build an image of your flask App
  3. Run locally in Docker and check
  4. Push the image to DockerHub
  5. Create configuration file for Kubernetes and deploy your application

Let’s create a simple Flask app to deploy in Kubernetes

Since we are focusing on deploying a Flask app in K8s, we will keep the app as simple as possible. We will use a logistic regression model trained on the Iris dataset, and a simple HTML page to pass input to the model and get a prediction.
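For readers curious how the pickled model in the repo could have been produced, here is a minimal sketch, assuming it is a scikit-learn LogisticRegression trained on the Iris dataset; the filename model.pkl is illustrative, not necessarily the name used in the repo:

```python
# Sketch of training and pickling the Iris classifier (assumption:
# the repo's pickle holds a scikit-learn LogisticRegression; the
# filename "model.pkl" is illustrative).
import pickle
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

iris = load_iris()
model = LogisticRegression(max_iter=200)
model.fit(iris.data, iris.target)  # features: sepal/petal length & width

with open("model.pkl", "wb") as f:
    pickle.dump(model, f)

# The Flask app would later load this pickle and call predict() on the
# four values entered in the HTML form.
with open("model.pkl", "rb") as f:
    loaded = pickle.load(f)
print(iris.target_names[loaded.predict([[5.1, 3.5, 1.4, 0.2]])[0]])  # → setosa
```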

Please refer to this repo to find all the files used in this blog.

In the repo you will find a pickle file, an app file, and a templates folder, which is all that is required to run the Flask app locally.

Once you clone the repo, navigate to the root folder and run the app’s entry script.
You should be able to load the page at http://localhost:5000/home, provided you have installed Flask on your machine.

Now you can test the app by filling in the length and width values and getting a prediction. We will now focus on the five steps that help us deploy the app in K8s.
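The predict flow described above can be sketched end to end without even starting a server, using Flask’s built-in test client. This is a self-contained illustration, not the repo’s actual code: the route name /predict and the form field names are assumptions, and a freshly trained model stands in for the pickle file:

```python
# Self-contained sketch of the form-to-prediction flow (assumptions:
# the real route and form field names live in the linked repo; a tiny
# in-memory model stands in for the pickled one).
from flask import Flask, request
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

iris = load_iris()
clf = LogisticRegression(max_iter=200).fit(iris.data, iris.target)

app = Flask(__name__)

@app.route("/predict", methods=["POST"])
def predict():
    # read the four measurements from the submitted HTML form
    values = [float(request.form[k]) for k in
              ("sepal_length", "sepal_width", "petal_length", "petal_width")]
    return str(iris.target_names[clf.predict([values])[0]])

# exercise the route without a running server
client = app.test_client()
resp = client.post("/predict", data={"sepal_length": "5.1", "sepal_width": "3.5",
                                     "petal_length": "1.4", "petal_width": "0.2"})
print(resp.get_data(as_text=True))  # → setosa
```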

1. Containerize your flask App

In order to containerize the app, we need to create a Dockerfile. A Dockerfile contains all the instructions Docker needs to assemble the image and run the application. We will also need a requirements.txt file listing all the dependencies that must be installed for the image to run.

FROM python:3.6
RUN mkdir /app
WORKDIR /app
ADD . /app/
RUN pip install -r requirements.txt
EXPOSE 5000
CMD ["python", "/app/"]

Here is the set of instructions Docker will use to build the image:

  1. Get the python:3.6 base image from DockerHub
  2. Create an /app directory and make it the working directory
  3. Copy all the files into the /app directory
  4. Install all the requirements needed for the image to run
  5. Configure the port to listen on and the command to run the Flask app
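The requirements.txt referenced in step 4 could look roughly like the following; the exact packages and versions are an assumption, so pin whatever your repo actually uses:

```text
# requirements.txt — illustrative; pin the versions your repo actually uses
flask
scikit-learn
numpy
```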

2. Build an image of your flask App

Assuming you have successfully installed Docker Desktop on your machine (refer here if you are new), we run the docker build command to build an image from the Dockerfile.

docker build -f Dockerfile -t iris-app:latest .

This command builds the image and we can verify it by running

docker image ls
(screenshot: cmd output of running docker image ls)

As you can see, we have iris-app in the image list. Let’s try to run this image using Docker.

3. Run locally in Docker and check

We can test our image locally with Docker before deploying it to Kubernetes.

docker run -p 6000:5000 iris-app

Now we should be able to load the same page (IRIS SPECIES PREDICTION) by navigating to http://localhost:6000/home. (Note: some browsers block port 6000 as an unsafe port; if the page does not load, remap to another host port, e.g. -p 8000:5000.)
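The -p 6000:5000 flag maps host port 6000 to the container’s port 5000, where the app listens. A quick terminal check could look like this (the container name iris-test is illustrative, and this assumes the image built in the previous step):

```shell
# Map host port 6000 to the container's port 5000 and run detached
docker run -d -p 6000:5000 --name iris-test iris-app:latest

# Hit the app through the mapped host port (curl is not subject to
# browser "unsafe port" blocking)
curl -s http://localhost:6000/home

# Clean up the test container
docker rm -f iris-test
```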

4. Push the image to DockerHub (best practice)

As we have tested the image successfully, it is good practice to push the image to DockerHub. This keeps your image safe, lets you share it with others, and makes it easier to maintain. You can also pull the image created for this blog from my DockerHub here.

First you need to create an empty repository to push your image to, or you can pull from mine. (My repo name is iris-app-flask.)

# Tag your image accordingly before pushing 
docker tag iris-app:latest gouthamceast/iris-app-flask:latest
# Push your image to repository
docker push gouthamceast/iris-app-flask:latest

5. Create configuration file for Kubernetes and deploy your application

Before we start deploying to Kubernetes, we need to make sure K8s is installed properly on our machine. As beginners we can install minikube, a local Kubernetes, from here. The installation is pretty straightforward, and once it is done successfully, we run the following command to start Kubernetes.

minikube start

We can check by running a kubectl command such as kubectl get nodes; if there is no error, Kubernetes is running successfully, otherwise there is a problem with the minikube installation. Please revisit the documentation to get it resolved.

Create the deployment YAML file to run the app in Kubernetes; this is again a set of instructions for running the app in Kubernetes.

apiVersion: v1
kind: Service
metadata:
  name: iris-service
spec:
  selector:
    app: iris-python
  ports:
    - protocol: "TCP"
      port: 5500
      targetPort: 5000
      nodePort: 30002
  type: NodePort
---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: iris-python
spec:
  selector:
    matchLabels:
      app: iris-python
  replicas: 1
  template:
    metadata:
      labels:
        app: iris-python
    spec:
      containers:
        - name: iris-python
          image: gouthamceast/iris-app-flask:latest
          imagePullPolicy: IfNotPresent
          ports:
            - containerPort: 5000

We have two sections in this YAML file:

  1. Creates a NodePort service that routes incoming traffic on node port 30002 (any port in the default NodePort range, 30000–32767, will do) to the pods.
  2. Creates a Deployment instance of the image by pulling it from DockerHub.

We apply this YAML file to the running Kubernetes cluster with the following command.

kubectl apply -f deployment.yml

Once this command is executed, we can check the status of the services and pods using,

kubectl get service
kubectl get pods

We have now deployed our Flask app in Kubernetes; let’s load the application.

Running kubectl get service shows iris-service as a NodePort service exposing TCP ports 5500:30002.

As one last step, we run the following command to get the application endpoint.

minikube service iris-service

Now we can hit the tunnel endpoint to load our application running on Kubernetes.

And finally we have achieved our goal. The Kubernetes ecosystem has a steep learning curve; let’s keep building and exploring more projects to understand it better and get practical.

If there are any issues with installing Docker or minikube, please do comment. I’ll try to guide you as much as I can.

Senior ML engineer | NLP & CV enthusiast | MLOps Azure & AWS