Using Pub/Sub as a Laravel queue driver

Alex De Smet
5 min read · Nov 30, 2020


Some time ago, I created a container to run Laravel apps on Google Cloud Run. This harnessed the scalable power and ease of setup that comes with a serverless solution. The result was a responsive and cost-effective service. Cloud Run spins new container instances up as needed, and down when there’s no traffic. Even all the way down to zero when your app has no visitors. Pay-as-you-use, billed to the nearest 100 milliseconds!

While this scaling is great for request-based traffic, it poses a problem for processing queued tasks. Spinning up background tasks in Cloud Run is a big no-no, so how do we design around this limitation in a Cloud Native fashion?

First, let’s take a look at how Laravel uses queues.

Laravel queues are a perfect way to offload time-consuming, load-intensive processing away from your main request threads. By default, Laravel supports these queue drivers out of the box:

  • Database
  • Redis
  • Beanstalkd
  • Amazon SQS

For the first two we could use Cloud SQL as a database or Cloud Memorystore for a managed in-memory service. In both scenarios I would have to build some stateful component that polls these data sources and processes the tasks. This is of course still a valid approach, but I wanted something event-driven that scales automatically without the hassle of running a queue processing VM or something similar.

To set up such queue processing on GCP, there are two components that fit our needs: Cloud Tasks and Pub/Sub. They have very similar functionality, but I find Pub/Sub to be more flexible.

Pub/Sub takes messages that you publish on a topic and delivers them to subscribers. Push and pull subscriptions are supported, and several integrations with other GCP products are already built in. Oh, and the first 10 GB of usage per month is free; beyond that, you only pay for the time your messages are stored.

This article will focus on how to get a Laravel Job into a Pub/Sub topic. To follow along, you will need an existing GCP project and a Laravel app to experiment with.

Prepping in the Cloud

Let’s set up the Pub/Sub components. Go to the Google Cloud Console and create a Pub/Sub topic. I will use ‘laravel-queue-1’ in the example below.
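If you prefer the command line over the console, the same topic can be created with the gcloud CLI (assuming it is installed and pointed at your project):

gcloud pubsub topics create laravel-queue-1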

Set up authentication. To allow your Laravel app to connect and use the Pub/Sub topic, you will need a service account with the proper permissions.

Create a new account in IAM > Service Accounts. Give it the Pub/Sub Editor role.

Next, create a JSON key and download it to use in your app.
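Again, this can be scripted with gcloud. A rough sketch, in which the account name laravel-queue, the file credentials.json, and my-project-id are placeholders of my own choosing:

# create the service account
gcloud iam service-accounts create laravel-queue

# grant it the Pub/Sub Editor role on the project
gcloud projects add-iam-policy-binding my-project-id --member="serviceAccount:laravel-queue@my-project-id.iam.gserviceaccount.com" --role="roles/pubsub.editor"

# export a JSON key for the app to use
gcloud iam service-accounts keys create credentials.json --iam-account=laravel-queue@my-project-id.iam.gserviceaccount.com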

Back to Laravel

In order to use Pub/Sub as a Laravel queue driver, we need a third-party package. I will use the one provided by @kainxspirits.

Install it using Composer:

composer require kainxspirits/laravel-pubsub-queue

Great! You can now use this queue driver in your Laravel project. Insert this block into the connections array of your config/queue.php:

'pubsub' => [
    'driver' => 'pubsub',
    'queue' => env('PUBSUB_QUEUE', 'laravel-queue-1'),
    'project_id' => env('PUBSUB_PROJECT_ID', 'insert-project-here'),
    'retries' => 3,
    'request_timeout' => 60,
],
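The env() calls mean you can also drive these values from your .env file instead of the hard-coded defaults; for example (values are placeholders):

PUBSUB_QUEUE=laravel-queue-1
PUBSUB_PROJECT_ID=my-project-id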

Set the pubsub queue driver as the default in config/queue.php (or don’t, and specify the connection on each dispatch, as shown below):

'default' => env('QUEUE_CONNECTION', 'pubsub'),
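If you skip the default, Laravel lets you pick the connection per dispatch instead; for example, with the EchoOutput job we will create later in this article:

EchoOutput::dispatch(new DateTime())->onConnection('pubsub');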

The next step is to use the authentication JSON key you exported earlier. By default, the Google Cloud client libraries look for two environment variables:

GOOGLE_CLOUD_PROJECT_ID=[GOOGLE_PROJECT_ID]
GOOGLE_APPLICATION_CREDENTIALS=/path_to_credentials.json

To set these variables on Linux, use the export command:

export GOOGLE_CLOUD_PROJECT_ID=[GOOGLE_PROJECT_ID]
export GOOGLE_APPLICATION_CREDENTIALS=/path_to_credentials.json

Writing the code

If you already have Jobs that are dispatched to a queue, you can skip this part.

Laravel uses Job classes to push work onto a queue and handle it asynchronously. As a proof of concept, I will create an EchoOutput job that logs the time of invocation. Create the job using this command:

php artisan make:job EchoOutput

This will create an app/Jobs/EchoOutput.php file. In the handle() method, we log the date:

Log::info("Job sent at ".$this->message->format('Y-m-d H:i:s'));
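For reference, the complete job might look roughly like this. It is a minimal sketch in which the DateTime passed to dispatch() is stored as $this->message (my assumption, chosen to match the log line above):

<?php

namespace App\Jobs;

use DateTime;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\Log;

class EchoOutput implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    protected $message;

    // Store the DateTime passed in from the dispatch call
    public function __construct(DateTime $message)
    {
        $this->message = $message;
    }

    // Log the timestamp once a queue worker picks the job up
    public function handle()
    {
        Log::info("Job sent at ".$this->message->format('Y-m-d H:i:s'));
    }
}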

In your code, dispatch the job to our Pub/Sub queue. If you’re working in an empty project, routes/web.php will do for now:

EchoOutput::dispatch(new DateTime());
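In a fresh app, a throwaway route is enough to trigger the dispatch; the /push-job path below is just an example name:

use App\Jobs\EchoOutput;

Route::get('/push-job', function () {
    // Dispatching pushes the job onto the default (pubsub) connection
    EchoOutput::dispatch(new DateTime());

    return 'Job pushed to Pub/Sub';
});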

To the cloud and back!

Spin up the development server on your local machine and browse to the page that populates the queue.

php artisan serve

That should have pushed the job to Pub/Sub. In the GCP Cloud Console, you can see your message by creating a pull subscription on your topic and then using the View messages button. Don’t ACK them for now.

Huzzah, the message appears (base64-encoded).
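The same check works from the command line; these commands create a pull subscription (the name laravel-queue-1-pull is arbitrary) and peek at the message without acknowledging it:

gcloud pubsub subscriptions create laravel-queue-1-pull --topic=laravel-queue-1
gcloud pubsub subscriptions pull laravel-queue-1-pull --limit=1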

The next step is to attach a push subscription to this Pub/Sub topic and trigger a serverless component to do the processing. For now, we will use our localhost code for testing.

To process this message, spin up a queue worker in your localhost app:

php artisan queue:work

Check the laravel.log file in storage/logs/laravel.log. It should contain an entry similar to this:
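The exact timestamp and environment will differ, but expect something along these lines (an illustrative line, not copied from a real run):

[2020-11-30 14:05:12] local.INFO: Job sent at 2020-11-30 14:05:10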

That’s it! You have implemented a low-cost, maintenance-free queue that scales up when needed, without any change to your application.

By using Pub/Sub, you can also choose to offload your queue processing to a GCP Serverless component such as Cloud Functions or Cloud Run. Both support HTTP triggers to provide auto-scaling, high-performing Cloud power. I’ll save that for a future article.

Or imagine having several Pub/Sub queues, each of which triggers a different GCP component that best fits your need in terms of CPU/memory power or flexibility.

Thanks for reading! If you ever want to chat on building awesome cloud stuff, ping us at www.vbridge.eu
