Managing robots.txt Files for Different Environments with Laravel Vapor
You may want to customise your robots.txt file for different environments if you’re using Laravel Vapor to deploy your Laravel application. For example, you might want to allow web crawlers to index your production site but disallow them on your development and staging environments.
In this blog post, we’ll go over how to publish your Laravel Vapor configuration and include a code snippet to manage your robots.txt files for different environments.
Publishing the Laravel Vapor Configuration

The first step is to publish your Laravel Vapor configuration. This will allow you to make changes to the configuration files and keep them under version control.
To publish the configuration, run the following command in your terminal:
php artisan vendor:publish --tag=vapor-config
This will create a vapor.php file in your config directory. You can now make changes to this file to customise your Vapor configuration.
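For orientation, the published file is a standard Laravel configuration file that returns an array of options. Here is a trimmed sketch keeping only the option this post touches; the real file contains additional entries:

<?php

// config/vapor.php (trimmed): only the option discussed in this post
// is shown; the published file contains additional Vapor options.
return [

    'redirect_robots_txt' => true,

];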
The redirect_robots_txt Option

Before we add the code snippet to manage our robots.txt files for different environments, we need to modify a specific part of the vapor.php configuration file to disable Vapor’s default handling of the robots.txt file.
By default, Vapor redirects requests for the robots.txt file to the S3 asset bucket or CloudFront’s asset URL rather than passing them to your application. When using the code snippet to manage your robots.txt files for different environments, you need to disable this default behaviour so that requests for the robots.txt file reach your application.
To do this, locate the following section in your vapor.php configuration file:
/*
|--------------------------------------------------------------------------
| Redirect robots.txt
|--------------------------------------------------------------------------
|
| When this option is enabled, Vapor will redirect requests for your
| application's "robots.txt" file to the location of the S3 asset
| bucket or CloudFront's asset URL instead of serving directly.
|
*/
'redirect_robots_txt' => true,
Change the redirect_robots_txt value from true to false as shown below:
'redirect_robots_txt' => false,
With this modification, requests for the robots.txt file will no longer be redirected to the asset URL and will instead be handled by the route we define in the next section.
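As an optional variation, you could read the value from an environment variable instead of hard-coding it, letting each deployment decide for itself. This is a sketch; VAPOR_REDIRECT_ROBOTS_TXT is a hypothetical variable name used for illustration:

// config/vapor.php: hypothetical variation that reads the toggle from an
// environment variable (VAPOR_REDIRECT_ROBOTS_TXT is an illustrative name).
'redirect_robots_txt' => env('VAPOR_REDIRECT_ROBOTS_TXT', false),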
Managing robots.txt Files for Different Environments

Now that we have published our Vapor configuration and modified the redirect_robots_txt option, we can add the code snippet to manage our robots.txt files for different environments.
In your web.php routes file, add the following code snippet:
use Illuminate\Support\Facades\Route;

Route::get('/robots.txt', function () {
    // Allow crawlers in production; block them in every other environment.
    $robots = match (config('app.env')) {
        'production' => 'User-agent: *'."\n".'Allow: /',
        default => 'User-agent: *'."\n".'Disallow: /',
    };

    return response($robots)->header('Content-Type', 'text/plain');
});
This code creates a route that returns the appropriate robots.txt content based on the environment.
If the environment is set to production, the code allows all user agents to access your site by returning User-agent: * and Allow: /.
For all other environments, the code disallows all user agents from accessing your site by returning User-agent: * and Disallow: /.
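If you later need finer-grained rules, the match expression can grow one arm per environment. Here is an illustrative sketch; the sitemap line is an assumption on my part, not something Vapor requires:

// Illustrative extension: production advertises a sitemap while all other
// environments remain blocked. Adjust the arms to suit your application.
$robots = match (config('app.env')) {
    'production' => 'User-agent: *'."\n".'Allow: /'."\n".'Sitemap: '.url('sitemap.xml'),
    default => 'User-agent: *'."\n".'Disallow: /',
};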
With this code in place and the redirect_robots_txt option disabled, requests for the robots.txt file will be handled by your application and return the rules appropriate to the environment being accessed.
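Finally, it’s worth locking this behaviour in with a quick feature test. Below is a minimal sketch using Laravel’s built-in HTTP testing; the class name and the choice of environments to assert against are illustrative:

<?php

namespace Tests\Feature;

use Tests\TestCase;

class RobotsTxtTest extends TestCase
{
    public function test_non_production_environments_block_crawlers(): void
    {
        // Override the environment name for this request only.
        config(['app.env' => 'staging']);

        $this->get('/robots.txt')
            ->assertOk()
            ->assertSee('Disallow: /');
    }

    public function test_production_allows_crawlers(): void
    {
        config(['app.env' => 'production']);

        $this->get('/robots.txt')
            ->assertOk()
            ->assertSee('Allow: /');
    }
}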