Handling long-running background tasks in Drupal 7

In my previous post, I discussed how to import a large dataset into Drupal via Drush's batch API. In this blog post, I'll cover how to create background tasks in Drupal 7 for work that takes a long time to finish.

Why would you ever want that?

If you have a task that must occur regularly but takes a long time to complete, the cron queue might be a good solution. For instance, if you have a lot of nodes that need to stay synchronized with a remote dataset, you might want to process a large portion of them during a cron run while still letting your other cron tasks complete in a timely manner.

Implement hook_cron()

hook_cron() is run every time the Drupal cron job runs. However, it is not well suited to long-running tasks, since cron runs each module's hook sequentially. To avoid holding up the other cron tasks, we'll create an item in the DrupalQueue instead and let a queue worker process it.
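The original code listings are not reproduced here, so the following is a minimal sketch of the three pieces involved — hook_cron(), hook_cron_queue_info(), and the worker callback. The module name, queue name, and the mymodule_get_stale_nids() / mymodule_sync_node() helpers are placeholders, not functions from the original post:

```php
/**
 * Implements hook_cron().
 *
 * Enqueue one item per node that needs syncing; the actual work
 * happens later, in the queue worker.
 */
function mymodule_cron() {
  $queue = DrupalQueue::get('mymodule_sync');
  foreach (mymodule_get_stale_nids() as $nid) {
    $queue->createItem($nid);
  }
}

/**
 * Implements hook_cron_queue_info().
 *
 * Registers the worker callback and the maximum number of seconds
 * cron may spend processing this queue per run.
 */
function mymodule_cron_queue_info() {
  $queues['mymodule_sync'] = array(
    'worker callback' => 'mymodule_sync_worker',
    'time' => 60,
  );
  return $queues;
}

/**
 * Queue worker callback.
 *
 * Receives exactly one queued item (here, a single node ID) per call.
 */
function mymodule_sync_worker($nid) {
  // Hypothetical sync routine; replace with your own logic.
  mymodule_sync_node($nid);
}
```

Because cron only spends the declared 'time' budget on the queue per run, any items left over simply wait for the next cron invocation rather than blocking other modules' cron hooks.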

With just those three functions you have created background tasks that will not hold up the normal cron run! It is worth noting that you don't have to enqueue these items from hook_cron(); you could add them at some other time. You might create the queue items on node creation or deletion, for instance.
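As one example of enqueueing outside of cron, queue items can be created from hook_node_insert(). This is a sketch under the same placeholder names as above; the 'event' bundle check is illustrative, not from the original post:

```php
/**
 * Implements hook_node_insert().
 *
 * An alternative to hook_cron(): queue the sync work as soon as a
 * node is created, and let the queue worker pick it up on the next
 * cron run.
 */
function mymodule_node_insert($node) {
  if ($node->type == 'event') {
    $queue = DrupalQueue::get('mymodule_sync');
    $queue->createItem($node->nid);
  }
}
```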

Hello, I used the same structure as in this sample, but it's not working and I don't get why. Here is the code:

function MODULENAME_cron() {
  // A function that fetches the array of node IDs I want.
  $nodes = expired_nodes('type');
  $queue = DrupalQueue::get('update_node');
  foreach ($nodes as $row) {
    $queue->createItem($row);
  }
  drupal_flush_all_caches();
}

function MODULENAME_cron_queue_info() {
  $queues['update_node'] = array(
    'worker callback' => 'MODULENAME_callback',
    'time' => 30, // Time in seconds for each worker.
  );
  return $queues;
}

function MODULENAME_callback($data) {
  foreach ($data as $row) {
    db_insert('field_data_field_SOMENAME')
      ->fields(array(
        'entity_type' => 'node',
        'bundle' => 'event',
        'entity_id' => $row,
        'revision_id' => $row,
        'language' => 'und',
        'delta' => 0,
        'field_other_tid' => 196,
      ))
      ->execute();
    db_insert('field_revision_field_SOMENAME')
      ->fields(array(
        'entity_type' => 'node',
        'bundle' => 'event',
        'entity_id' => $row,
        'revision_id' => $row,
        'language' => 'und',
        'delta' => 0,
        'field_other_tid' => 196,
      ))
      ->execute();
  }
}

I used the code, and I believe that flushing the caches also clears the queue. Perhaps flush the caches before adding to the queue? I was running into the same problem, but it went away after I removed that particular line.