
I'm writing a plugin and I need to call a function every time a submission file is uploaded. So far I've managed to get it working by using "TemplateManager::display" and checking which .tpl is being used.

However, now I need to call this function when the import is done through the native import/export plugin as well. Is there a way to know when a submission file is being uploaded and associated with an article?

There isn't a generic 'submission file uploaded' hook; you'll have to make sure a hook is called in all of the places submission files are uploaded. The Plagiarism plugin is an example of registering against the submission file upload in the 5-step submission process. For all other places where files are uploaded (e.g. the import/export plugin), you'll have to register against the hook there, or create one if it doesn't exist.
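To illustrate, hook registration in a generic plugin looks roughly like this. This is a sketch assuming the PKP plugin API; the hook name is one example of an upload path (the one the Plagiarism plugin uses), not a complete list:

```php
// Sketch only: register a callback inside the plugin's register() method.
// 'submissionsubmitstep4form::execute' fires during the 5-step submission
// process; other upload paths (e.g. import/export) need their own hooks.
function register($category, $path, $mainContextId = null) {
    $success = parent::register($category, $path, $mainContextId);
    if ($success && $this->getEnabled()) {
        HookRegistry::register('submissionsubmitstep4form::execute', array($this, 'handleUpload'));
    }
    return $success;
}

function handleUpload($hookName, $args) {
    // Inspect the calling code to see what $args contains for this hook.
    // Returning false lets any other registered callbacks run as well.
    return false;
}
```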

I am doing some new work on an OJS module to process uploaded articles (wherever they are uploaded) into PDF, XML, and HTML formats. In trying to get my first file upload hook to work, I am logged into an OJS test site as an editor and am looking at the Editing page for an article.

I can see that the "editor" role actions on this page are handled by the /classes/submission/sectionEditor/SectionEditorAction.inc.php class. I'm using HookRegistry to hook the SectionEditorAction::uploadCopyeditVersion call to trigger my own plugin to process the uploaded file. I can see that the uploadCopyeditVersion() function fires when I hit the Upload button in the CopyEditor section, but it doesn't seem to be making it to my callback function. I set up the callback in the plugin pretty simply as:
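A registration of that kind would look something like this (a sketch based on the hook and function names described above; the exact original code isn't shown):

```php
// In the plugin's register() method, after parent::register() succeeds:
HookRegistry::register(
    'SectionEditorAction::uploadCopyeditVersion',
    array(&$this, 'fileCallback')
);

// The callback; OJS passes the hook name and an array of arguments.
function fileCallback($hookName, $args) {
    error_log("fileCallback fired for $hookName");
    return false; // false = allow other registered handlers to run too
}
```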

But a die("made it this far") planted in my fileCallback() function isn't firing, nor is a log command there; it seems like it's getting ignored.

So my question is: is there anything special one must do to get the HookRegistry to reset itself vis-à-vis knowing the needs of each plugin? Is a plugin Disable/Enable cycle enough for the HookRegistry to pick up new info, or do I have to be more extreme and delete/reinstall my plugin (during the testing cycle)?

Thanks,

Last edited by Damion on Tue Mar 19, 2013 3:10 pm, edited 1 time in total.

Plugins need to be registered in the database before you'll get hook callbacks; the tools/upgrade.php script can do this if you use it to run through the upgrade process, or alternatively you can look at the existing plugin entries as templates.

One question: I'm trying to create and upload a supplemental file in response to a user's uploaded article file. I need to create an article_supplemental_file record. I have a known-good $fileId and a good $articleId.

But although the new supplemental record is created this way, and the file is associated with it, the setTypeOther value isn't set, nor is the title (neither is in the db). I see that the localizable text fields are handled by $this->updateLocaleFields($suppFile); in SuppFileDAO.inc.php. Is there something more I need to do?
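One thing worth checking: the localized setters on SuppFile expect a locale argument, and updateLocaleFields() only persists values keyed by locale. A sketch, assuming the OJS 2.x SuppFile/SuppFileDAO API (the example titles are placeholders, and whether typeOther is localized varies by version):

```php
// Sketch: create a supplementary file record with localized fields set.
import('classes.article.SuppFile');
$suppFileDao =& DAORegistry::getDAO('SuppFileDAO');
$locale = AppLocale::getLocale();

$suppFile = new SuppFile();
$suppFile->setArticleId($articleId);
$suppFile->setFileId($fileId);
$suppFile->setType('');                        // empty type = "other"
$suppFile->setTypeOther('Derived markup');     // if this field is localized in
                                               // your version, pass $locale too
$suppFile->setTitle('Generated supplement', $locale); // localized: needs a locale
$suppFileDao->insertSuppFile($suppFile);
```

If a localized value is set without a locale key, updateLocaleFields() has nothing to write to the settings table, which would match the symptoms described above.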

The next challenge in this plugin project is how to spawn a separate PHP process that will perform lengthy tasks on the uploaded file. Since all the heavy work is to be done by this separate process, the user will be able to resume other operations immediately after uploading a file.

Should I stay clear of a pcntl_fork() approach? Is there a recommended way to do this in OJS?

The idea is that it could run as a PHP command-line script, like "submitDocument.php [param 1] [param 2] [param 3]".

But of course the issue is that submitDocument.php needs access to a few OJS classes:

$articleDao = &DAORegistry::getDAO('ArticleDAO');
$article = &$articleDao->getArticle($articleId);

and later:

import('classes.file.ArticleFileManager');
$articleFileManager = new ArticleFileManager($articleId);

etc.
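For what it's worth, the standard tools bootstrap gives a CLI script exactly that access. A sketch modeled on tools/mergeUsers.php, assuming the OJS 2.x CommandLineTool API (the class name and argument handling here are invented for illustration):

```php
// tools/submitDocument.php -- sketch of a CliTool-style script.
// bootstrap.inc.php sets up the same class loader and DAO registry
// that the web application uses.
require(dirname(__FILE__) . '/bootstrap.inc.php');

class SubmitDocumentTool extends CommandLineTool {
    var $articleId;

    function SubmitDocumentTool($argv = array()) {
        parent::CommandLineTool($argv);
        $this->articleId = isset($this->argv[0]) ? (int) $this->argv[0] : 0;
    }

    function execute() {
        $articleDao =& DAORegistry::getDAO('ArticleDAO');
        $article =& $articleDao->getArticle($this->articleId);
        if (!$article) exit(1);

        import('classes.file.ArticleFileManager');
        $articleFileManager = new ArticleFileManager($this->articleId);
        // ... long-running work on the article's files goes here ...
    }
}

$tool = new SubmitDocumentTool(isset($argv) ? $argv : array());
$tool->execute();
```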

I guess one approach is to make a wget call, using the user's browser credentials, to the OJS site with the action coded in the URL parameters. But is there a way to just launch a separate server-side process which has easy access to the same class/method calls as the plugin code itself?

Have a look at the ProcessDAO and related classes; they're used by the citation assistant to launch longer-running back-end tasks without tying up the user. It's a weird workaround, but PHP doesn't offer many facilities for doing this in better ways...

Findings on ProcessDAO:
- allows one to set up a request for an operation, e.g. "getDocumentMarkup"
- allows one to trigger a pre-set number of parallel calls to this operation, e.g. 4
- each call to an operation has a pre-set amount of time to execute; after that it is marked as a "zombie"
- the expectation is that as each individual call finishes it gets removed from the list/count, making room for subsequent requests for an operation
- no parameters can be passed through to the operation handler, so extra information has to be communicated indirectly, e.g. by having the handler go look at a queue

It seems like this is oriented towards system-wide, database-table-driven operations. If we set up a queue of tasks, then we could trigger spawnProcesses(), and each process could be dedicated to the next available queue item. Subsequent calls to spawnProcesses() would just ensure that the full complement of processes is working on the given tasks?

Rather than get into the queue thing, is it possible to use something like the CliTool? I'm looking at triggering getDocumentMarkup with parameters, much like tools/mergeUsers.php does.

The CliTool-based tools are intended for manual invocation from the command line, so I'm not sure they'll help you much with operational tasks. You could also look at the ScheduledTask class (and related classes), which are intended to be run periodically (e.g. daily) by cron. However, many users don't have the ability to set up cron tasks (e.g. on shared hosts), so relying on them will cut down considerably on the number of users who are able to run it.

The document markup task needs to be run as quickly as possible, so I did try the command line invocation rather than a scheduled approach. I was able to set it up so that the various hooks for uploading a file would catch its temporary location, then send to

with the DocumentMarkupCliTool.php script looking a lot like mergeUsers.php, bringing in only the classes/environment required to do the task at hand. So that seems to be working well, with the long task running until complete in the background. I guess the issue now is to make sure DocumentMarkupCliTool.php always exits and never hangs. Any other concerns about using exec this way? $articleFilePath is vetted as a straight path to a temporary copy of the uploaded file.
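For reference, a background invocation along these lines might look like the following sketch. The tool name and $articleFilePath are taken from this thread; escapeshellarg() guards the argument even though the path is already vetted, and the redirection plus trailing & are what let exec() return without waiting (assumes a *NIX host with 'php' on the PATH):

```php
// Sketch: launch the CLI tool detached so the web request returns immediately.
$cmd = sprintf(
    'php %s %s > /dev/null 2>&1 &',
    escapeshellarg('tools/DocumentMarkupCliTool.php'),
    escapeshellarg($articleFilePath)
);
exec($cmd);
```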

A nice side effect of this is that the details of the script's work are reported when testing directly from the command line. I was able to catch a few bugs that way.

There are several downsides to that approach -- you'll need to know where the PHP binary is, for example, and it's fairly *NIX-centric. It's basically equivalent to using pcntl_fork but more convoluted; if you go that route, better just to use pcntl_fork directly.
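A direct pcntl_fork() version, for comparison. This is a sketch; it requires the pcntl extension (CLI-oriented, and typically unavailable under mod_php), so it guards for that and falls back:

```php
// Sketch: fork a worker so the parent can finish the request immediately.
if (!function_exists('pcntl_fork')) {
    error_log('pcntl not available; fall back to exec() or a task queue');
} else {
    $pid = pcntl_fork();
    if ($pid === -1) {
        error_log('fork failed');
    } elseif ($pid === 0) {
        // Child: do the long-running conversion here, then exit so the
        // child never continues into the parent's request handling.
        exit(0);
    }
    // Parent ($pid > 0) falls through and completes the request as normal.
}
```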

I think the reason the ProcessDAO structures were written was to accomplish the same thing without leaving it dependent on the original user's request. ProcessDAO and its related classes solve a problem similar to the one you're encountering: when submitting an article with a long list of citations, how do we check/parse/correlate all those citations against external services without delaying the user?