First of all, I would like to thank @nvie for this wonderful open-source project. I've been experimenting with combining RQ and Tornado, and I have RQ working with Tornado successfully.
I want to know whether it is possible to use global database connection objects in a Worker process. Let me show this in code.
```python
# tornado-init.py
import os

import torndb
import tornado.httpserver
import tornado.ioloop
import tornado.options
import tornado.web
from tornado.options import options


class Application(tornado.web.Application):
    def __init__(self):
        handlers = [
            (r"/", HomeHandler),
            (r"/archive", ArchiveHandler),
            (r"/feed", FeedHandler),
            (r"/entry/([^/]+)", EntryHandler),
            (r"/compose", ComposeHandler),
            (r"/auth/login", AuthLoginHandler),
            (r"/auth/logout", AuthLogoutHandler),
        ]
        settings = dict(
            blog_title=u"Tornado Blog",
            template_path=os.path.join(os.path.dirname(__file__), "templates"),
            static_path=os.path.join(os.path.dirname(__file__), "static"),
            ui_modules={"Entry": EntryModule},
            xsrf_cookies=True,
            cookie_secret="__TODO:_GENERATE_YOUR_OWN_RANDOM_VALUE_HERE__",
            login_url="/auth/login",
            debug=True,
        )
        tornado.web.Application.__init__(self, handlers, **settings)
        # Have one global connection to the blog DB across all handlers
        self.db = torndb.Connection(
            host=options.mysql_host, database=options.mysql_database,
            user=options.mysql_user, password=options.mysql_password)


def main():
    tornado.options.parse_command_line()
    http_server = tornado.httpserver.HTTPServer(Application())
    http_server.listen(options.port)
    tornado.ioloop.IOLoop.instance().start()


if __name__ == "__main__":
    main()
```
From `http_server = tornado.httpserver.HTTPServer(Application())` we can see that the `Application` instance is created once and is global for all handlers. The MySQL connection `self.db` can then be used within any handler as `self.application.db`. I don't know how to achieve the same thing in RQ Workers.
Is it possible to use global DB connections when executing jobs with RQ Workers? If so, how can the connection object be used inside a classmethod? I'm trying to achieve something like the code below...
```python
# worker-init.py
from DB import Redis, MySQL  # my own wrapper module around the two clients
from rq import Worker, Queue, Connection

listen = ['high', 'default', 'low']


class WorkerDB(object):
    def __init__(self):
        self.MySQL = MySQL()
        self.Redis = Redis()


if __name__ == '__main__':
    with Connection(Redis()):
        worker = Worker(map(Queue, listen))
        # I would like to hand the shared connections to the worker so
        # that every job can reuse them (this is not the real API,
        # just what I am trying to achieve):
        worker.work(WorkerDB())
```
```python
# MyClass.py
class MyClass(object):
    @classmethod
    def mymethod(cls, WorkerDB):
        # Here I should be able to get the same WorkerDB object
        # that was passed to Worker.work()
        cls.MySQL = WorkerDB.MySQL
```
If this cannot be done, the Worker creates new connections to Redis and MySQL every time, which is inefficient and can become a bottleneck. I verified this with `redis-cli info | grep connections`.
One thing I could do is use a singleton to work around the issue, but I don't think that is the right way. I'm eagerly looking for a solution.
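For what it's worth, the singleton-style workaround I mentioned would look roughly like this: a module that jobs import, which creates each connection lazily once per worker process and then reuses it. This is only a sketch; `connections.py` is a hypothetical module name, and `make_connection()` is a placeholder for the real constructor (e.g. `torndb.Connection(...)` or `redis.Redis()`).

```python
# connections.py -- hypothetical shared-connection module (sketch only).
# Jobs would do `from connections import get_connection` and call it;
# the connection is created once per worker process and then reused.

_connection = None  # module-level cache, one per process


def make_connection():
    # Placeholder for the real constructor, e.g.
    # torndb.Connection(host=..., database=..., user=..., password=...)
    # or redis.Redis()
    return object()


def get_connection():
    """Create the connection lazily on first use, then reuse it."""
    global _connection
    if _connection is None:
        _connection = make_connection()
    return _connection
```

Every job running inside the same worker process would then share one connection, but at the cost of exactly the kind of hidden global state I was hoping to avoid.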
Thanks in advance. I'm looking forward to a reply on this issue.