Queued jobs are PHP objects that get serialized, persisted, transferred, unserialized, and finally executed. They may run multiple times, on multiple machines, and they may run in parallel. To design reliable queued jobs, we must consider each stage the jobs go through.
In this post, we're going to look at one of the most important things to consider when designing a queued job.
Making Jobs Self-contained
There's no way to know for sure when a queued job will run. It may run instantly after being sent to the queue, or it may run hours later.
Since the state of the system may change between the time a job is dispatched and the time it's picked up by a worker, we need to make sure our jobs are self-contained, meaning they carry everything they need to run without relying on any external system state:
DeployProject::dispatch(
    $site, $site->lastCommitHash()
);

class DeployProject implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public Site $site;

    public string $commitHash;

    public function __construct(Site $site, string $commitHash)
    {
        $this->site = $site;
        $this->commitHash = $commitHash;
    }
}
In this example, we could have extracted the last commit hash inside the job's handle method. However, by the time the job runs, new commits may have been pushed to the site's repository.
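To make the difference concrete, here's a framework-free sketch of the two designs. FakeSite and both job classes are hypothetical stand-ins for illustration, not Laravel classes; the point is only when the hash is read:

```php
<?php

// Hypothetical stand-in for a site whose repository keeps moving.
class FakeSite
{
    public function __construct(public string $head) {}

    public function lastCommitHash(): string
    {
        return $this->head;
    }
}

// Self-contained: the hash is captured when the job is created.
class SelfContainedDeploy
{
    public function __construct(private string $commitHash) {}

    public function handle(): string
    {
        return "deploying {$this->commitHash}";
    }
}

// Fragile: the hash is only read when a worker runs the job.
class LateBindingDeploy
{
    public function __construct(private FakeSite $site) {}

    public function handle(): string
    {
        return "deploying {$this->site->lastCommitHash()}";
    }
}

$site = new FakeSite('green-button');

$good = new SelfContainedDeploy($site->lastCommitHash());
$bad  = new LateBindingDeploy($site);

// New commits land while the jobs wait in the queue.
$site->head = 'blue-button';

echo $good->handle(), "\n"; // deploying green-button
echo $bad->handle(), "\n";  // deploying blue-button
```

The self-contained job deploys what the user asked for; the late-binding one deploys whatever happens to be at the head of the repository when a worker gets around to it.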
👋 This post is part of the "Laravel Queues in Action" eBook. Check it out for a crash course, a cookbook, a guide, and a reference.
If the purpose of this job was to deploy the latest commit, then extracting the hash when the job runs would have made sense. But this job deploys the commit that was at the head of the repository when the user manually triggered the deployment.
If the user changed the color of the "Log In" button to green, deployed, and then changed it to blue, they'd expect the deployment to give them a green button.
While designing your jobs, look at every piece of information the job relies on and decide whether it should be captured at dispatch time or resolved fresh when the job runs.
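The decision can go both ways in the same job. Here's a hypothetical, framework-free sketch (NotifySubscribers and its data are invented for illustration): the message is captured at dispatch time because it's what the admin approved, while the recipient list is resolved at run time because it must be fresh:

```php
<?php

// Illustrative only: not a Laravel class.
class NotifySubscribers
{
    // Captured at dispatch time: the exact message the admin approved.
    public function __construct(private string $approvedMessage) {}

    // Resolved at run time: the subscriber list is fetched fresh, so
    // people who unsubscribed while the job sat in the queue are skipped.
    public function handle(callable $fetchSubscribers): array
    {
        $sent = [];

        foreach ($fetchSubscribers() as $email) {
            $sent[$email] = $this->approvedMessage;
        }

        return $sent;
    }
}

$job = new NotifySubscribers('March newsletter');

// By run time, the current subscriber list is whatever the callback returns.
$sent = $job->handle(fn () => ['a@example.com', 'b@example.com']);

// Each current subscriber receives the message approved at dispatch time.
```

Neither choice is universally right; what matters is making it deliberately for each piece of data the job touches.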