Flask end response and continue processing

Question:

Is there a way in Flask to send the response to the client and then continue doing some processing? I have a few book-keeping tasks which are to be done, but I don’t want to keep the client waiting.

Note that these are actually really fast things I wish to do, thus creating a new thread, or using a queue, isn’t really appropriate here. (One of these fast things is actually adding something to a job queue.)

Answer #1:

Sadly teardown callbacks do not execute after the response has been returned to the client:

import flask
import time

app = flask.Flask("after_response")

@app.teardown_request
def teardown(request):
    time.sleep(2)
    print("teardown_request")

@app.route("/")
def home():
    return "Success!\n"

if __name__ == "__main__":
    app.run()

When curling this you’ll note a 2s delay before the response displays, rather than the curl ending immediately and then a log 2s later. This is further confirmed by the logs:

teardown_request
- - [25/Jun/2018 15:41:51] "GET / HTTP/1.1" 200 -

The correct way to execute after a response is returned is to use WSGI middleware that adds a hook to the close method of the response iterator. This is not quite as simple as the teardown_request decorator, but it’s still pretty straight-forward:

import traceback
from werkzeug.wsgi import ClosingIterator

class AfterResponse:
    def __init__(self, app=None):
        self.callbacks = []
        if app:
            self.init_app(app)

    def __call__(self, callback):
        self.callbacks.append(callback)
        return callback

    def init_app(self, app):
        # install extension
        app.after_response = self

        # install middleware
        app.wsgi_app = AfterResponseMiddleware(app.wsgi_app, self)

    def flush(self):
        for fn in self.callbacks:
            try:
                fn()
            except Exception:
                traceback.print_exc()

class AfterResponseMiddleware:
    def __init__(self, application, after_response_ext):
        self.application = application
        self.after_response_ext = after_response_ext

    def __call__(self, environ, start_response):
        iterator = self.application(environ, start_response)
        try:
            return ClosingIterator(iterator, [self.after_response_ext.flush])
        except Exception:
            traceback.print_exc()
            return iterator

Which you can then use like this:

AfterResponse(app)

@app.after_response
def after():
    time.sleep(2)
    print("after_response")

From the shell you will see the response return immediately, and then two seconds later "after_response" will hit the logs:

- - [25/Jun/2018 15:41:51] "GET / HTTP/1.1" 200 -
after_response

This is a summary of a previous answer provided here.

Answered By: Matthew Story

Answer #2:

QUICK and EASY method.

We will use Python's threading library to achieve this.

Your API consumer has sent something to be processed by the my_task() function, which takes 10 seconds to execute. But the consumer wants a response as soon as they hit your API; that is the return_status() function.

You tie my_task to a thread and immediately return a quick response to the API consumer, while the big job completes in the background.

Below is a simple POC.

import os
from flask import Flask,jsonify
import time
from threading import Thread

app = Flask(__name__)

@app.route("/")
def main():
    return "Welcome!"

@app.route("/status")
def return_status():
    """Return the response first and tie my_task to a thread"""
    Thread(target=my_task).start()
    return jsonify('Response asynchronously')

def my_task():
    """Big function doing some job; here just a pandas DataFrame-to-CSV conversion"""
    import pandas as pd
    pd.DataFrame(['sample data']).to_csv('./success.csv')
    print('large function completed')

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)

Answered By: Danish Xavier

Answer #3:

I had a similar problem with my blog. I wanted to send notification emails to those subscribed to comments when a new comment was posted, but I did not want the person posting the comment to wait for all the emails to be sent before getting their response.

I used a multiprocessing.Pool for this. I started a pool of one worker (that was enough for a low-traffic site), and then each time I needed to send an email I prepared everything in the Flask view function but passed the final send_email call to the pool via apply_async.

Answered By: Miguel

Answer #4:

Sounds like Teardown Callbacks would support what you want. And you might want to combine it with the pattern from Per-Request After-Request Callbacks to help with organizing the code.
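
A minimal sketch of the per-request callback pattern using Flask's built-in after_this_request (the route and the book-keeping action are illustrative; note that, as Answer #1 shows, these callbacks still run before the response reaches the client, so they suit cheap book-keeping rather than truly post-response work):

```python
from flask import Flask, after_this_request

app = Flask(__name__)

@app.route("/")
def index():
    @after_this_request
    def bookkeeping(response):
        # registered per-request; runs after the view returns,
        # while Flask finalizes the response
        print("doing some book-keeping")
        return response

    return "Success!\n"
```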

Answered By: Tommi Komulainen

Answer #5:

You can find an example of how to use Celery from within Flask here: https://gist.github.com/jzempel/3201722

The gist of the idea (pun intended) is to define the long book-keeping tasks as @celery.task and use apply_async or delay from within the view to start the task.

Answered By: PuercoPop

Answer #6:

You can do this with WSGI's close protocol, exposed via the Werkzeug Response object's call_on_close decorator. It is explained in this other answer: https://stackoverflow.com/a/63080968/78903
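
A minimal sketch of call_on_close (the route and the printed message are illustrative):

```python
from flask import Flask, make_response

app = Flask(__name__)

@app.route("/")
def home():
    response = make_response("Success!\n")

    @response.call_on_close
    def on_close():
        # runs when the WSGI server closes the response iterator,
        # i.e. after the body has been sent to the client
        print("post-response book-keeping")

    return response
```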

Answered By: Kiran Jonnalagadda
