Heroku delayed_job workers killed during deployment

On Heroku, I use delayed_job to run asynchronous tasks. All is well until I do a git push heroku master; the Heroku environment then kills any worker processes that are mid-job.

The issue is that those jobs are never re-queued: the delayed_job table in my database still shows them as locked and running, even though the workers that were servicing them are long dead.

How do I prevent this situation? I'd like Heroku either to wait for all in-progress delayed jobs to complete or error out before shutting down, or at least to terminate them in a way that lets a new worker pick them up once the dynos restart after the deploy.

Best How To:

Looks like you can configure delayed_job to handle SIGTERM and mark in-progress jobs as failed, so they will be picked up and retried when workers come back:

Use this setting to throw an exception on TERM signals by adding it in your initializer:
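A minimal sketch of that initializer, assuming delayed_job 3.x (the file path is illustrative):

```ruby
# config/initializers/delayed_job.rb
#
# Make workers raise a SignalException when they receive SIGTERM,
# so the job currently being worked is marked as failed (and thus
# retried) instead of being left locked by a dead worker.
Delayed::Worker.raise_signal_exceptions = :term
```

Heroku sends SIGTERM on shutdown and follows up with SIGKILL after a short grace period, so raising on TERM gives the worker a chance to release the job before the hard kill.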

It is very common to run SQLite in development and test and a different database, such as PostgreSQL, in production. Rails makes this easy: in your Gemfile, you can use groups to specify which gems should be installed in each environment. So, in this scenario, your Gemfile...
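A minimal Gemfile sketch along those lines (gem names are the usual ones; versions omitted for brevity):

```ruby
# Gemfile — SQLite locally, PostgreSQL on Heroku
source 'https://rubygems.org'

gem 'rails'

group :development, :test do
  gem 'sqlite3'   # lightweight local database
end

group :production do
  gem 'pg'        # Heroku's supported database adapter
end
```

Run bundle install --without production locally so Bundler never tries to build pg on a machine without PostgreSQL headers.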

Compile assets locally. To solve this chicken-and-egg problem with the database during deploy, do not invoke precompile in production; precompile on your development machine before pushing. With a functional code base pushed, you can then run heroku db:xxx commands on Heroku. Turn off asset precompilation in production.rb:...
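The snippet is truncated; the setting usually cited for this database-during-precompile problem in Rails 3.x is the one below (canonically placed in config/application.rb rather than production.rb — a sketch, not taken from the original):

```ruby
# config/application.rb (Rails 3.x)
#
# Don't initialize the full application (including the database
# connection) when running `rake assets:precompile`.
config.assets.initialize_on_precompile = false
```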

You can use the Heroku sbt plugin: https://devcenter.heroku.com/articles/deploying-scala-and-play-applications-with-the-heroku-sbt-plugin. This may be the preferred approach for applications that take a long time to compile, or that need to be deployed from a continuous-integration server such as Travis CI or Jenkins.

You can use curl to download the backup from its public URL, e.g.:

curl -o appname.dump `heroku pg:backups public-url --app appname`

Then you can use the Postgres pg_restore utility to restore the backup into your local database, e.g.:

pg_restore --verbose --clean --no-acl --no-owner -h localhost -U appuser -d appname appname.dump

(substitute...

The same thing happened to me. As far as I can remember, the cause is the port your application listens on. Heroku will not run your app on port 8080 or 3000; instead it assigns a random port. Use code like this to fix the problem: var port...

"Specified 'sqlite3' for database adapter, but the gem is not loaded. Add gem 'sqlite3' to your Gemfile (and ensure its version is at the minimum required by ActiveRecord)." Add gem 'sqlite3', group: :development to your Gemfile and run bundle install. You should also be putting the pg gem in...

No, this is not possible. Your app automatically sleeps when it receives no requests for 30 minutes, and is awakened when a request comes in. If you have exceeded your daily quota, the app simply will not be awakened.

This doesn't work because @products isn't an array any more; it's a reference to a job that will be executed at some time in the future. You need to rethink how your setup works. For example, your view could poll via Ajax to see if the job has...

Do you have PostgreSQL installed locally? If not, that might be the cause (having gem 'pg' in your Gemfile is not enough to install the server locally). You will have to run sudo apt-get install postgresql postgresql-contrib to install it. You do not need it installed locally to push to Heroku, though... as long...

If you're on a *nix host, I'd recommend running a separate, non-Rails Ruby script that is allowed to talk to the database and update a summary table containing the information you need to return to clients. There is no reason to have it run inside Rails or even to load...

Here is the answer from Bonsai support:
- You could always set up a script with a curl command to index the MongoDB collections, then use cron to run the script at regular intervals.
- You could also use an Elasticsearch client to index collections in some cases.
So I...

DATABASE_URL is what currently stores your second app's connection to its provisioned DB, which Heroku is kindly preventing you from deleting because there are no other references to it. First, remove the second application's DB. Anything in it will be destroyed: heroku addons:destroy heroku-postgresql:<your DB tier> --app <your second app>...

Working with Heroku support, I was not able to get this to work with the current configuration. So in the end I decided to create a database for our staging app, imported the latest dump file from production, and ran heroku run rake db:migrate against staging. I still got the...

If you add a custom deploy script to Codeship after the Heroku deployment step, it will run after the app is up, so you'll have database access. You also have access to the Heroku Toolbelt, so you should be able to run: heroku run --app YOUR_APP_NAME -- ./node_modules/.bin/knex migrate:latest

Solved it. Put static files such as CSS in a separate folder, e.g. 'mystatic', and list that folder in STATICFILES_DIRS; then run manage.py findstatic mycssfile.css to make sure the file is found. Then, once uploaded to Heroku, collectstatic finds the file. All clear now.

ActiveJob is merely an abstraction on top of various background-job processors, so many capabilities depend on which provider you're actually using; I'll try not to depend on any particular backend. Typically, a job provider consists of a persistence mechanism and runners. When offloading a job, you write it into the persistence...
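The persistence + runner split described above can be sketched in plain Ruby (this is a toy illustration of the concept, not ActiveJob's actual internals; all class names are hypothetical):

```ruby
require 'json'

# Toy job backend: enqueuing serializes the job into a store,
# and a runner later reads it back and performs it.
class ToyQueue
  def initialize
    @store = []   # stand-in for a jobs table or Redis list
  end

  # "Enqueue": persist a serialized description of the job.
  def enqueue(job_class, *args)
    @store << JSON.generate('job' => job_class.name, 'args' => args)
  end

  # "Runner": deserialize each record and perform it.
  def work_off
    while (raw = @store.shift)
      payload = JSON.parse(raw)
      Object.const_get(payload['job']).new.perform(*payload['args'])
    end
  end
end

class GreetJob
  def perform(name)
    $results << "Hello, #{name}!"
  end
end

$results = []
queue = ToyQueue.new
queue.enqueue(GreetJob, 'Heroku')  # returns immediately; nothing runs yet
queue.work_off                     # the "runner" performs the job later
```

The key point is that only the serialized description crosses the queue, which is why objects like an ActiveRecord relation can't be passed through as-is.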

Initializing git in a new directory should be the best option. If your repo is already initialized, check your remotes:

$ git remote -v
heroku ssh://xxxxxxxx.xxx:repo.git (fetch)
heroku ssh://xxxxxxxx.xxx:repo.git (push)

Check the list you have there. The first column is the remote name; then you should write...

Heroku's dyno filesystem is ephemeral, so in practice you can't persist files there; you would have to keep them in memory within a single thread. If you want a free storage system, I recommend Google Drive. You'll need to do some searching on how to use it, since...

Puma is threaded, so you need a thread-safe pool of connections to PostgreSQL; otherwise concurrent requests will all use the same connection at the same time, which is not what you want. Have a look at the connection_pool gem; it should help.
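A stdlib-only sketch of what such a pool does (a stub object stands in for a real PG connection; in a real app you would use the connection_pool gem with PG.connect as the factory):

```ruby
require 'thread'

# Minimal thread-safe connection pool: a Queue hands each thread its
# own connection, so no two threads ever share one concurrently.
class TinyPool
  def initialize(size, &factory)
    @pool = Queue.new
    size.times { @pool << factory.call }
  end

  # Check a connection out, yield it, and always check it back in.
  def with
    conn = @pool.pop            # blocks if every connection is in use
    yield conn
  ensure
    @pool << conn if conn
  end
end

# Stub standing in for a PG connection.
StubConn = Struct.new(:id)

pool = TinyPool.new(3) { StubConn.new(rand(1000)) }

threads = 10.times.map do
  Thread.new do
    pool.with { |conn| conn.id }  # each request borrows exclusively
  end
end
threads.each(&:join)
```

Ten threads share three connections here; requests simply wait for a free connection instead of trampling a shared one.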

Instead of storing user-uploaded files locally, Heroku recommends putting them on an external service such as Amazon S3. You may want to use an existing library for this, e.g. KnpGaufretteBundle, which lets you easily use Gaufrette in your Symfony projects. Gaufrette itself is "a PHP5 library that provides a filesystem abstraction layer". Amazon S3...

I am assuming that by dotenv you are referring to the file named .env. Every variable in that file that you need on Heroku must be defined in Heroku config vars: https://devcenter.heroku.com/articles/config-vars. The .env file is an easy way to replicate the Heroku config vars locally; that way your code...

You can use a fork of the node-ftp package to do this, but it isn't in npm, so it requires a bit of manual installation. Clone the repo locally: git clone git@github.com:choonyme/node-socksftp.git Then copy the socksftp directory into your project directory (e.g. cp -r node-socksftp/socksftp ./node_modules/), export your QuotaGuard Static connection...

I just went through this same scenario. The certificate you see on your herokuapp domain is the wildcard certificate issued for *.herokuapp.com. If you want to secure a custom domain such as http://my-app-name.com, you need to purchase and install your own wildcard certificate, e.g. via DNSimple.

PostgreSQL comes preinstalled on Cloud9. You can't simply run bundle exec rake db:migrate, though, because you have to set it up and connect to it first. Refer to the documentation at https://docs.c9.io/v1.0/docs/setting-up-postgresql on how to set it up. Also, you don't need PostgreSQL installed locally to be able to deploy...

In your Gemfile:

group :production do
  gem 'pg', '0.17.1'
  gem 'rails_12factor', '0.0.2'
end

and also remove gem 'sqlite3', OR move it:

group :development, :test do
  gem 'sqlite3'
end

This is because Heroku can't install the sqlite3 gem, but you can tell Bundler not to try except when developing. Then run bundle...

It turns out that Codeship doesn't keep anything; in fact, different servers handle deployment than testing. The best practice here seems to be to recreate the assets on the Heroku side with a custom buildpack which, directly after the git pull, installs the dependencies and compiles the app...

This looks to be a difference in how the pg gem and underlying libpq driver handle typing versus the SQLite driver, stemming from a deliberate decision by the driver developers to leave type conversion to the application framework and return everything as a string. By executing raw SQL and going...

It says: run this command from an app folder. To do that, you have to clone the app first, then go into the app folder and run your command again. Or just specify the app via --app oldname: heroku apps:rename newname --app oldname

That log excerpt is from a one-off dyno (à la heroku run console); this is entirely separate from your web dynos, which you may be running as 2X dynos. You need to specify --size=2x in your heroku run command to have the one-off process use a 2X dyno.

You're using a change method rather than an up, which you should only do for the subset of migrations that Rails can reverse automatically. Change the change method to up, and add a down method that reverses the effects of the up method.
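A sketch of that shape, using the Rails 3/4-era migration base class (table, column, and SQL here are hypothetical examples, not taken from the question):

```ruby
class AddLegacyCode < ActiveRecord::Migration
  # `up` applies the change. Raw SQL like this is not something
  # Rails can reverse automatically, hence the explicit pair.
  def up
    add_column :products, :legacy_code, :string
    execute "UPDATE products SET legacy_code = 'none'"
  end

  # `down` undoes exactly what `up` did, so the migration
  # can be rolled back.
  def down
    remove_column :products, :legacy_code
  end
end
```

With up/down in place, rake db:rollback knows how to reverse the migration even though the execute call is opaque to Rails.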