7th January 2021 • Laravel
So you've built the perfect website, poured hours of time and effort into getting it just right.
You've tested it in as many browsers and on as many devices as you possibly can.
It’s finally ready - it’s time to deploy your website and show the world.
Luckily, Laravel is there for you - as it always is - with Envoy.
In this article I show you how I deploy this very website using Laravel Envoy, so you can do the same with yours.
Envoy is your new friend that will log in to your server for you and run any tasks you want it to.
By creating a single Blade file, you can get Envoy to deploy your site with a single command.
Let’s walk through how…
Before we begin, you need to have a web server set up and ready to go. Setting up a server to host your production site is outside the scope of this article; I'd recommend this article by DigitalOcean to get started.
You will need to be able to log in to your server via SSH. If you can log in from your machine via the command line, Envoy will be able to do its thing.
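As a quick check, if a command like the one below (using the same user@ip_address placeholder as the rest of this article) drops you onto your server's shell, Envoy will be able to log in too:
ssh user@ip_address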
Once you’re set up, install Envoy with the following command:
composer require laravel/envoy --dev
The Envoy file is where you tell Envoy what you want it to do.
In the root of your application, create a file named Envoy.blade.php.
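If you prefer the command line, you can create the empty file from your project root like so:
touch Envoy.blade.php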
The Envoy file contains two parts. First, we set the server information.
@servers(['prod' => ['user@ip_address']])
This should be populated with your SSH user and IP address. Next, we can define any number of tasks. In our case we'll just create one called deploy.
@task('deploy', ['on' => 'prod'])
@endtask
Within our task, we list the commands we want Envoy to perform. Imagine you’ve logged in to the server yourself to deploy or update your site.
@task('deploy', ['on' => 'prod'])
cd /path/to/codingwithstef.com
git pull origin master
composer install --optimize-autoloader --no-dev
php artisan cache:clear
php artisan view:cache
php artisan optimize
@endtask
No Migrations?
This site isn't database driven, so there's no need for them to run in my case.
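If your site is database driven, you could add a migration step to the task yourself - for example, something like the line below (the --force flag skips the confirmation prompt artisan shows when running migrations in production):
php artisan migrate --force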
Here we are telling Envoy to change into the site's directory, pull the latest code from the master branch, install the Composer dependencies, clear the application cache, compile the Blade views, and cache the framework's configuration and routes.
composer install --optimize-autoloader --no-dev?
Adding the --optimize-autoloader flag builds an optimised class autoloader, and --no-dev skips development-only packages, both of which are recommended when deploying to production.
Here's the full file:
@servers(['prod' => ['user@ip_address']])
@task('deploy', ['on' => 'prod'])
cd /path/to/codingwithstef.com
git pull origin master
composer install --optimize-autoloader --no-dev
php artisan cache:clear
php artisan view:cache
php artisan optimize
@endtask
That's it - now we can run Envoy with the following command:
php vendor/bin/envoy run deploy
You will see the commands being run in your terminal, so you can check everything has been done correctly and spot any errors.
That’s it - how easy is that?
For more complex setups, you can run tasks on multiple servers, and even run them in parallel, like so:
@servers(['server-1' => 'user@ip_address', 'server-2' => 'user@ip_address'])
@task('deploy', ['on' => ['server-1', 'server-2'], 'parallel' => true])
cd /path/to/codingwithstef.com
git pull origin master
composer install --optimize-autoloader --no-dev
php artisan cache:clear
php artisan view:cache
php artisan optimize
@endtask
You can even pass in arguments; for example, you could tell Envoy which branch to deploy:
@servers(['server-1' => 'user@ip_address', 'server-2' => 'user@ip_address'])
@task('deploy', ['on' => ['server-1', 'server-2'], 'parallel' => true])
cd /path/to/codingwithstef.com
git pull origin {{ $branch }}
composer install --optimize-autoloader --no-dev
php artisan cache:clear
php artisan view:cache
php artisan optimize
@endtask
php vendor/bin/envoy run deploy --branch=master
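If you want the branch argument to be optional, one small tweak (my own sketch, not part of the original setup) is to give the variable a default inside the task using PHP's null coalescing operator - the task body is compiled through Blade, so this works as you'd expect:
git pull origin {{ $branch ?? 'master' }}
With that in place, running php vendor/bin/envoy run deploy without the --branch option falls back to master.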
To read further, check out the Envoy documentation here.
Thanks for reading, hopefully you found this article useful. Having this in place makes updating this site so much easier!
As always, if you have any feedback or just want to chat about code, you can find me on Twitter @CodingWithStef, and on my YouTube channel.