
How to reboot a server in DigitalOcean

DigitalOcean offers a variety of cloud services for developers, but out of all of them I use only the VPS (virtual private server) product.

At the same time, because of the large number of cloud products, it is not immediately obvious how to perform even such a simple action in DigitalOcean as rebooting a VPS.

The server can be restarted by connecting to it over SSH and running the following command as root (or with sudo):

reboot

But if the server has frozen and SSH is not responding, you can force a reboot of the VPS from the DigitalOcean control panel.

How to restart a VPS in DigitalOcean

Go to the control panel and select Droplets from the menu, then select the server you want to restart.

Then select the “Power” tab.

On it you will see two possible actions:

  • Turn off Droplet – turn off the VPS
  • Power cycle – reboot the VPS

That is, to restart the VPS, press the “Power cycle” button.

After that, wait until the reboot is completed.
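If you prefer to script this instead of clicking through the control panel, the same power cycle can also be triggered through the DigitalOcean API. The following is only a minimal sketch in Python: the Droplet ID is a placeholder, the personal access token is read from an environment variable name of my choosing, and the third-party requests package is assumed to be installed.

import os

import requests  # third-party HTTP client, assumed to be installed

# Placeholder values: put your personal access token into the
# DIGITALOCEAN_TOKEN environment variable and use your real Droplet ID.
token = os.environ["DIGITALOCEAN_TOKEN"]
droplet_id = 123456789

# Ask the DigitalOcean API (v2) to power cycle the Droplet.
response = requests.post(
    f"https://api.digitalocean.com/v2/droplets/{droplet_id}/actions",
    headers={"Authorization": f"Bearer {token}"},
    json={"type": "power_cycle"},
    timeout=30,
)
response.raise_for_status()
print(response.json()["action"]["status"])  # e.g. "in-progress"

The API reports the status of the action (for example, “in-progress”); the Droplet itself reboots just as if you had pressed “Power cycle” in the panel.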

What is the difference between Turn off Droplet and Power cycle

Turn off Droplet roughly corresponds to cutting the power to your virtual private server: the server stops running, but the Droplet itself is preserved, along with its IPv4 and IPv6 addresses and other settings.

You can turn your server back on at any time.

The “Turn off Droplet” feature is useful if you want to make the server unavailable or shut it down for any reason.

Please note that turning off the server with “Turn off Droplet” does not stop billing! You will still be charged for the VPS according to the plan and server parameters you selected!

If you no longer want to be charged for the server, you need to destroy it (that is, go to the “Destroy” section and follow the steps to delete the server). After the server is deleted, it cannot be restored!

“Power cycle” cuts the power to the server and then turns it back on, which corresponds to a forced reboot of the VPS.

In other words, you can restart the server either by cutting the power with “Turn off Droplet” and then turning it back on manually, or by performing a “Power cycle”. However, these actions are not quite equivalent:

  • Turning the server off and on with “Turn off Droplet” first tries to send a shutdown command to the server; if that fails, the power is cut forcibly. You then have to turn the server back on manually. Overall this takes longer, but it is a bit safer.
  • “Power cycle” turns the power off and back on without attempting a graceful shutdown. It takes less time (provided that your VPS is healthy and able to boot on its own).

DigitalOcean promo code

If you want to get a DigitalOcean promo code for testing VPS (or other cloud features) for free, then use this link.

You will be given $200, which you can use to create a VPS, among other things.

Sitemap.xml files: what they are for, how to use them, and how to work around the “Too many URLs” error and size limits

Table of contents

  1. What are Sitemaps
  2. What are the restrictions for sitemap files
  3. How can you compress a sitemap file
  4. Can I use multiple sitemaps?
  5. What is the structure of sitemap files
  6. How to generate sitemap files
  7. How to Import a Sitemap into Google Search Console
  8. Sitemap.xml file status “Couldn't fetch”
  9. Is it necessary to use the sitemap.xml file?
  10. What to do if the sitemap contains an error. How to remove a sitemap file from Google Search Console

What are Sitemaps

Sitemaps are XML files that list the URLs of your site's pages. They are submitted to the Google search engine so that it can quickly discover and index those pages.

What are the restrictions for sitemap files

  1. The file size must not exceed 50 MB (uncompressed)
  2. One file can contain no more than 50,000 links (a quick way to check both limits is shown below)
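If you generate sitemap files automatically, it is easy to check a file against both limits before submitting it. A minimal sketch in Python (the file name is just an example):

import os
import xml.etree.ElementTree as ET

# Quick sanity check of a local sitemap.xml against the two limits above.
SITEMAP_PATH = "sitemap.xml"  # example file name
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

size_mb = os.path.getsize(SITEMAP_PATH) / (1024 * 1024)
url_count = sum(1 for _ in ET.parse(SITEMAP_PATH).getroot().iter(NS + "url"))

print(f"Size: {size_mb:.1f} MB (limit: 50 MB)")
print(f"URLs: {url_count} (limit: 50,000)")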

How can you compress a sitemap file

In addition to the plain text format with XML markup, the file can be compressed into a .gz archive. This reduces the file size dramatically, because text files compress very well: for example, my 25 MB file was compressed down to about 500 KB.

To do this, simply compress the original sitemap.xml file into .gz format. In Google Search Console, specify the path to the archive as the sitemap URL, for example: https://site.net/sitemap.xml.gz
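If you don't want to compress the file by hand, a few lines of Python using only the standard library will do it (the file names are just examples):

import gzip
import shutil

# Compress sitemap.xml into sitemap.xml.gz, leaving the original file untouched.
with open("sitemap.xml", "rb") as src, gzip.open("sitemap.xml.gz", "wb") as dst:
    shutil.copyfileobj(src, dst)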

If, when you open the https://site.net/sitemap.xml.gz file in a web browser, it is downloaded to your computer instead of being displayed like sitemap.xml, this is normal. Either way, Google Search Console will be able to process the file.

Can I use multiple sitemaps?

For each site or domain property, you can create multiple Sitemaps and submit them all to Google Search Console. This is not only allowed, but actually recommended by Google for sitemaps that would otherwise be too large.

If there are many Sitemap files, a complete list of them can be collected in a separate Sitemap file, called a “Sitemap index file”. An example of the content of such a sitemap.xml file:

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
	<sitemap>
		<loc>https://site.net/sitemaps/sitemap_1.xml</loc>
	</sitemap>
	<sitemap>
		<loc>https://site.net/sitemaps/sitemap_2.xml</loc>
	</sitemap>
	<sitemap>
		<loc>https://site.net/sitemaps/sitemap_3.xml</loc>
	</sitemap>
</sitemapindex>

After that, it is enough to import this main file into Google Search Console.

The rest of the sitemaps listed in the index file will be picked up by Google Search Console automatically.

To see them, click on the file name. You will see a list of imported Sitemaps.

You will need to wait until these files are processed and their status changes to “Success”.

What is the structure of sitemap files

Sitemap files have the following structure:

<?xml version="1.0" encoding="utf-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
	<url>
		<loc>https://domain.site.net/?p=1</loc>
		<lastmod>2022-10-08T14:14:27+00:00</lastmod>
		<changefreq>monthly</changefreq>
		<priority>0.8</priority>
	</url>
	<url>
		<loc>https://domain.site.net/?p=2</loc>
		<lastmod>2022-10-08T14:14:27+00:00</lastmod>
		<changefreq>monthly</changefreq>
		<priority>0.8</priority>
	</url>
	<url>
		<loc>https://domain.site.net/?p=3</loc>
		<lastmod>2022-10-08T14:14:27+00:00</lastmod>
		<changefreq>monthly</changefreq>
		<priority>0.8</priority>
	</url>
</urlset>

Each entry consists of four elements:

  1. URL of the page (loc)
  2. Date of last modification (lastmod)
  3. Change frequency (changefreq), e.g. monthly
  4. Priority (priority)

How to generate sitemap files

If you are using WordPress, then the easiest way is to install a sitemap plugin.

If there is no sitemap plugin for your site engine, it is quite easy to generate one yourself, since a sitemap is just a text file with XML markup.
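As an illustration, here is a minimal sketch in Python that writes a sitemap in the format shown above. The list of URLs and the output file name are placeholders; in a real project you would take the URLs from your database or CMS.

from datetime import datetime, timezone
from xml.sax.saxutils import escape

# Placeholder list of page URLs; in practice, load them from your database or CMS.
urls = [
    "https://site.net/?p=1",
    "https://site.net/?p=2",
    "https://site.net/?p=3",
]

# Use the current time as the last modification date for every page.
lastmod = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S+00:00")

entries = []
for url in urls:
    entries.append(
        "\t<url>\n"
        f"\t\t<loc>{escape(url)}</loc>\n"
        f"\t\t<lastmod>{lastmod}</lastmod>\n"
        "\t\t<changefreq>monthly</changefreq>\n"
        "\t\t<priority>0.8</priority>\n"
        "\t</url>"
    )

sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    + "\n".join(entries)
    + "\n</urlset>\n"
)

with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write(sitemap)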

How to Import a Sitemap into Google Search Console

Go to Google Search Console, select the site you want to submit the Sitemap for, open the “Sitemaps” section, and enter the URL of the Sitemap.

Sitemap.xml file status “Couldn't fetch”

At first, the sitemap.xml file may show the status “Couldn't fetch”. This status can appear even if everything is fine with the sitemap.xml file. You just need to wait a little.

The point is that this status does not mean there is a problem with the sitemap.xml file; Google simply has not gotten around to processing it yet.

A little later, the status of the file will change to “Success”, and Google Search Console will show how many URLs were discovered thanks to this file.

Later still, you will be able to view the indexing report for the URLs from the sitemap.xml file.

Is it necessary to use the sitemap.xml file?

Personally, I don't usually use a sitemap.xml file. I add articles to most of my sites manually, and in my opinion a sitemap.xml file is not particularly needed there, since pages on such sites are indexed very quickly anyway.

But if you're unhappy with your site's indexing speed, or need to quickly report a large number of URLs to be indexed, then try using sitemap.xml files.

What to do if the sitemap contains an error. How to remove a sitemap file from Google Search Console

If, after the Sitemap has been processed, you find that it contains errors (for example, an incorrect date format or broken links), you do not have to wait for the next scheduled crawl.

You can delete the Sitemap from Google Search Console and immediately add it again. After that, Google will re-check the Sitemap file quite quickly (within a few minutes).

To remove a Sitemap file from Google Search Console, click on it. On the page that opens, find the button with three dots in the upper right corner, click it, and select “Remove sitemap”.

After that, the Sitemap will be removed, and once you have corrected the errors you can immediately re-add the Sitemap file with the same or a different URL.

How to prevent Tor users from viewing or commenting on a WordPress site

The Tor network is an important tool for anonymity, privacy, and censorship circumvention, one that some countries fight even at the state level.

But Tor is a public tool, so it is sometimes also used for online trolling and bullying. This article will show you how to:

  • prevent Tor users from commenting on your WordPress site
  • prevent Tor users from registering and logging into the site
  • prevent Tor users from viewing a WordPress site

WordPress plugin to control allowed actions from the Tor network

VigilanTor is a free WordPress plugin that can block comments, browsing, and registration for Tor users.

The plugin automatically keeps the list of Tor network IP addresses up to date and, once configured, monitors and blocks Tor users on its own.

To install VigilanTor, go to WordPress Admin Panel → Plugins → Add New.

Search for “VigilanTor”, install and activate it.

Then go to Settings → VigilanTor Settings.

We will perform all subsequent actions on the plugin settings page.

How to disable commenting on a site from Tor

Enable two settings:

  • Block Tor users from commenting (prevents Tor users from commenting on your WordPress site)
  • Hide comment form from Tor users

Now Tor users will still be able to view your site, but when they try to leave a comment, they will receive a message:

Error: You appear to be commenting from a Tor IP address which is not allowed.

How to prevent Tor users from registering and logging into the site

To prevent Tor users from registering on a WordPress site, and to prevent registered users from logging in from the Tor network, enable the following settings:

  • Block Tor users from registering
  • Flag users who signed up using Tor
  • Block Tor users from logging in (useful for preventing brute-force attacks)

How to Block Tor Users from Viewing a WordPress Site

Enable the following setting:

  • Block Tor users from all of WordPress

This setting will prevent any activity, including logging into the site, commenting, and browsing.

When trying to open a site in Tor, the user will receive a message:

Sorry, you cannot access this website using Tor.

How often does VigilanTor update the list of Tor IP addresses

The set of Tor network IP addresses changes frequently: new addresses are added and old ones are removed, so a downloaded list of Tor IP addresses becomes obsolete over time.

VigilanTor downloads the list of Tor IP addresses and keeps it up to date automatically.

By default, the update is performed every 10 minutes. You can increase this interval to 6 hours, or enable real-time updates.
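Setting the plugin aside, the general idea is easy to illustrate: download the current list of Tor exit node addresses and check a visitor's IP against it. The sketch below is in Python and is not how VigilanTor itself works; it simply fetches the Tor Project's public bulk exit list (assumed URL) and tests a made-up example address.

import urllib.request

# Download the Tor Project's public bulk exit list (one IP address per line).
EXIT_LIST_URL = "https://check.torproject.org/torbulkexitlist"

with urllib.request.urlopen(EXIT_LIST_URL, timeout=30) as response:
    tor_exit_ips = set(response.read().decode("utf-8").split())

# Made-up example address from the TEST-NET-3 documentation range.
visitor_ip = "203.0.113.10"

if visitor_ip in tor_exit_ips:
    print("Visitor is coming from a Tor exit node")
else:
    print("Visitor is not on the Tor exit list")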

How to prevent search engines from indexing only the main page of the site

To prevent search engines from indexing only the main page of a site, while allowing all other pages to be indexed, you can use several approaches, depending on how the particular site is set up.

1. Using the robots.txt file

If the main page has its own explicit address (usually index.php, index.html, index.htm, main.html, and so on), and opening a link like w-e-b.site/ redirects to that page, for example to w-e-b.site/index.htm, then you can use a robots.txt file with content like the following:

User-agent: *
Disallow: /index.php
Disallow: /index.html
Disallow: /index.htm
Disallow: /main.html

In fact, using an explicit name for the main page is the exception rather than the rule. So let's look at other options.

You can use the following approach:

  1. Block the entire site with the “Disallow” directive.
  2. Allow indexing of everything except the main page with the “Allow” directive.

Sample robots.txt file:

User-agent: *
Allow: ?p=
Disallow: /

Place the “Allow” directive before “Disallow”. The “Allow” directive permits all pages whose URL contains “?p=”, while the “Disallow” directive blocks all pages. The result is that indexing of the entire site, including the main page, is prohibited, except for pages with an address like “?p=”.

Let's look at the result of checking two URLs:

  • https://suay.ru/ (main page) – indexing is prohibited
  • https://suay.ru/?p=790#6 (article page) – indexing allowed

In the screenshot, number 1 marks the contents of the robots.txt file, number 2 is the URL being checked, and number 3 is the result of the check.

2. Using the robots meta tag

If your site consists of separate static files, add the robots meta tag to the HTML code of the main page file:

<meta name="robots" content="noindex,nofollow">

3. With .htaccess and mod_rewrite

Using .htaccess and mod_rewrite, you can block access to a specific file as follows:

RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} Google [NC,OR]
RewriteCond %{HTTP_USER_AGENT} Yandex [NC]
RewriteRule (index\.php)|(index\.htm)|(index\.html) - [F]

Please note that when a link like https://w-e-b.site/ is opened (that is, without the name of the main page), a specific file is still requested on the web server side, for example index.php, index.htm, or index.html. Therefore, this method of blocking access (and, accordingly, indexing) works even if the main page of your site opens without an explicit file name (index.php, index.html, index.htm, main.html, and so on), as is usually the case.

iThemes Security locked out a user – how to log in to WordPress admin when a user is banned (SOLVED)

iThemes Security is a plugin for WordPress that makes it difficult for hackers to attack the site and collect information.

Among other features, iThemes Security protects against brute-forcing of paths (searching for “hidden” folders and files), as well as against brute-force attacks on user credentials.

Once set up, the iThemes Security plugin usually works fine and doesn't require much attention. But sometimes your own user account can end up locked because someone else tried to guess its password.

The situation may arise in the following scenario:

1. You have enabled protection of accounts against brute-force password attacks

2. An attacker repeatedly tried to guess the password for your account

3. As a result, the account was locked

4. When you enter your username and password to get into the WordPress administration panel, you see a message saying that you are locked out (banned):

YOU HAVE BEEN LOCKED OUT.
You have been locked out

You don't have to wait until the account is unlocked.

If you have access to the file system, then you can immediately log into the WordPress admin panel.

I don't know of a way to bypass the iThemes Security lockout itself; instead, the plan of action is the following:

1. Disable iThemes Security

2. Login to the WordPress admin area

3. Enable iThemes Security

To disable any WordPress plugin, it is enough to make its folder unavailable. You don't even have to delete the folder – just rename it.

Open your hosting file manager and go to the following path: SITE/wp-content/plugins/

If you are using the command line, then the path to the plugin is: SITE/wp-content/plugins/better-wp-security

Find the better-wp-security folder and rename it to something like “-better-wp-security”.

Right after that, you can log into the WordPress admin panel.

Once you are logged into the WordPress admin panel, you can turn the iThemes Security plugin back on. To do this, rename the “-better-wp-security” folder back to “better-wp-security” (if WordPress has marked the plugin as deactivated in the meantime, activate it again on the Plugins page).

All is ready! No additional iThemes Security configuration is required.

In my case, checking the logs showed that the attack (brute-forcing user credentials) was carried out through the xmlrpc.php file.

The xmlrpc.php file provides functionality that most webmasters don't use but that is actively exploited by attackers. For this reason, you can safely block access to the xmlrpc.php file. If you don't know what this file is for, you most likely don't use it and can block access to it without any consequences.

You can disable XML-RPC with an .htaccess file or a plugin.

.htaccess is a configuration file that you can create and modify.

Just paste the following code into your .htaccess file at the root of your WordPress site (the solution uses mod_rewrite):

# Block requests for WordPress xmlrpc.php file
RewriteRule ^xmlrpc\.php - [NC,F]

Your server must support .htaccess files and mod_rewrite; most hosts do.
