I have an application deployed on GAE that exposes several endpoints. Each endpoint opens a connection to the database, fetches the data, closes the connection and returns the data. Normally everything works fine, but when there is a spike in requests the handlers start taking more than 60 seconds and the requests get aborted. Because of this the database connections are never closed, MySQL ends up with 1000+ open connections, and then every request starts aborting with a deadline exceeded error. Is there any solution for this?
4 Answers
You could wrap the "get data" portion in a try... finally... statement and move the "close connection" portion into the finally section. Then start an "about to exceed deadline" timer before "get data" (say 45 seconds) and raise an exception if the timer expires, allowing you to close the connection in the finally portion. That should take care of the orphaned open connections (but would not prevent errors in those requests).
If your application tolerates it you could also look into using task queues, which have a 10-minute deadline; that could help reduce or eliminate the request errors as well.
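As a rough illustration (not your actual code): assuming the `deferred` library of the first-generation Python runtime is enabled in app.yaml, and with hypothetical `get_connection()`/`store_result()` helpers and a placeholder query, the slow DB work could be pushed into a task like this:

    from google.appengine.ext import deferred

    def load_report(user_id):
        # Runs in a task queue, where the deadline is 10 minutes instead of 60 seconds.
        conn = get_connection()  # hypothetical helper that opens the MySQL connection
        try:
            cursor = conn.cursor()
            cursor.execute("SELECT total FROM reports WHERE user_id = %s", (user_id,))
            store_result(user_id, cursor.fetchall())  # hypothetical: persist results for later pickup
        finally:
            conn.close()

    def enqueue_report(user_id):
        # Called from the request handler: enqueue the work instead of running it inline.
        deferred.defer(load_report, user_id)

The request handler then returns quickly and the client fetches the stored result later (or gets notified), instead of waiting on the query.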
You can also find some general advice for addressing deadline exceeded errors here: https://cloud.google.com/appengine/articles/deadlineexceedederrors, though I don't know if it's applicable to your app.
EDIT: actually the suggestion in the first paragraph above doesn't work on GAE as the Python sandbox doesn't allow installing a custom signal handler:
    signal.signal(signal.SIGALRM, timer_expired)
AttributeError: 'module' object has no attribute 'signal'
After seeing your code, a somewhat equivalent solution would be to replace your cursor.fetchall() with a loop of cursor.fetchone() or cursor.fetchmany() calls, to split your operation into smaller pieces:
http://dev.mysql.com/doc/connector-python/en/connector-python-api-mysqlcursor-fetchone.html. You'd take a start timestamp (with time.time() for example) when entering your request handler. Then inside the loop you'd take another timestamp to measure the time elapsed since the start, and you'd break out of the loop and close the DB connection when the deadline expiration nears. Again, this won't help with actually replying successfully to the requests if preparing the replies genuinely takes that long.
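A minimal sketch of that batched loop (the `conn` argument, the batch size and the 45-second budget are assumptions for illustration, not taken from your code):

    import time

    REQUEST_BUDGET_SECONDS = 45  # stay well under the 60 s App Engine request deadline

    def fetch_rows(conn, query):
        """Read results in batches, bailing out early if the deadline is near."""
        start = time.time()
        rows = []
        cursor = conn.cursor()
        try:
            cursor.execute(query)
            while True:
                batch = cursor.fetchmany(size=100)
                if not batch:
                    break
                rows.extend(batch)
                if time.time() - start > REQUEST_BUDGET_SECONDS:
                    # Deadline is near: stop reading so the connection can be closed cleanly.
                    break
        finally:
            cursor.close()
            conn.close()
        return rows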
- how can I use a timer, can you please explain? Here is my code to connect to the database from Python: http://stackoverflow.com/questions/30913255/how-to-solve-process-terminated-because-the-request-deadline-was-exceeded-err – Sunil Garg Sep 04 '15 at 06:20
- I had in mind using `signal.SIGALRM` for the timer, but it doesn't work as the GAE Python sandbox doesn't allow overriding the signal handler: http://w3facility.org/question/does-google-app-engine-support-pythons-signal-module/. I'll update the answer. – Dan Cornilescu Sep 04 '15 at 14:44
 
You can use this solution to close connections when deadlines are exceeded:
Dealing with DeadlineExceededErrors
This way you won't have any open connections hanging there forever.
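A minimal sketch of that approach, assuming the first-generation Python runtime (where `DeadlineExceededError` lives in `google.appengine.runtime`), a hypothetical `get_connection()` helper and a placeholder query:

    import json

    import webapp2
    from google.appengine.runtime import DeadlineExceededError

    class DataHandler(webapp2.RequestHandler):
        def get(self):
            conn = get_connection()  # hypothetical helper that opens the MySQL connection
            try:
                cursor = conn.cursor()
                cursor.execute("SELECT id, name FROM items")  # placeholder query
                self.response.write(json.dumps(cursor.fetchall()))
            except DeadlineExceededError:
                # The runtime signalled the soft deadline: give up on the reply,
                # but fall through to finally so the connection is not leaked.
                self.response.set_status(500)
            finally:
                conn.close()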
- Catch the DeadlineExceededError exception and close the database connection before the hard deadline exception is thrown. – Andrei Volgin Sep 04 '15 at 12:07
 
Think about the design of your application:
1. Using deadline exception handling - a design smell
There will be situations where the DB operation takes more than 60 seconds. If it's a simple query then that's fine, but otherwise reconsider the design of the application; the user experience is going to suffer.
2. Change the design to use Cloud Endpoints
https://cloud.google.com/appengine/docs/java/endpoints/
That's the way to go, and future proof.
3. Use backends or task queues, as described in this post.
You can set the timeouts interactive_timeout and/or wait_timeout; depending on the connection type, MySQL uses one or the other.
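For example, a sketch of setting the per-session value from Python (the connection credentials and the 60-second value are placeholders, and this assumes the MySQL Connector/Python driver):

    import mysql.connector  # assuming the MySQL Connector/Python driver

    conn = mysql.connector.connect(user='app', password='secret',
                                   host='127.0.0.1', database='appdb')
    cursor = conn.cursor()
    # Drop idle connections after 60 seconds for this session only.
    # With sufficient privileges, SET GLOBAL wait_timeout = 60 affects all new sessions instead.
    cursor.execute("SET SESSION wait_timeout = 60")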
- These settings won't affect App Engine. It has a 60 sec. deadline for HTTP requests if using automatic scaling. – Andrei Volgin Sep 03 '15 at 15:36
- You can set these values globally for the database or only locally per session. – Bernd Buffen Sep 03 '15 at 16:00