Tag: webmaster notes

iThemes Security locked out a user – how to log in to the WordPress admin when your user is banned (SOLVED)

iThemes Security is a plugin for WordPress that makes it difficult for hackers to attack the site and collect information.

Among other features, iThemes Security protects against brute-forcing of paths (searching for “hidden” folders and files), as well as against brute-force attacks on user passwords.

Once set up, the iThemes Security plugin usually works fine and doesn't require much attention. But sometimes your own user account may get blocked because someone tried to guess its password.

The situation may arise in the following scenario:

1. You have enabled protection of accounts against brute-force password attacks

2. An attacker repeatedly tried to guess your account's password

3. As a result, the account was blocked

4. When you enter your username and password to get into the WordPress administration panel, you get a message that you are locked out (banned):

YOU HAVE BEEN LOCKED OUT.

You don't have to wait until the account is unlocked.

If you have access to the file system, then you can immediately log into the WordPress admin panel.

I don't know a way to bypass the iThemes Security lockout directly, so instead the plan of action is as follows:

1. Disable iThemes Security

2. Login to the WordPress admin area

3. Enable iThemes Security

To disable any WordPress plugin, it is enough to remove its folder. You don't even have to delete it – renaming works just as well.

Open your site's file manager and navigate to the following path: SITE/wp-content/plugins/

If you are using the command line, then the path to the plugin is: SITE/wp-content/plugins/better-wp-security

Find the better-wp-security folder and rename it to something like “-better-wp-security”.

Right after that, you can log into the WordPress admin panel.

Once you are logged into the WordPress admin panel, you can reactivate the iThemes Security plugin. To do this, rename the “-better-wp-security” folder to “better-wp-security”.
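If you work over SSH, the two renames can be sketched like this (shown here as a scratch-directory demo; in practice run the mv commands inside SITE/wp-content/plugins, which is only an example path):

```shell
# Demo in a scratch directory; the folder names match the article,
# the /tmp path is just for illustration.
mkdir -p /tmp/plugins-demo/better-wp-security
cd /tmp/plugins-demo
mv -- better-wp-security -better-wp-security   # disable the plugin
# ...log in to the WordPress admin panel here...
mv -- -better-wp-security better-wp-security   # enable it again
ls
```

The `--` is needed because the renamed folder starts with a dash, which mv would otherwise treat as an option.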

Done! No additional iThemes Security configuration is required.

Checking the logs showed that the attack (brute-force user credentials) was carried out through the xmlrpc.php file.

The xmlrpc.php file provides features that most webmasters don't use but that are actively exploited by hackers. For this reason, you can safely block access to xmlrpc.php. If you don't know what this file is for, you most likely don't use it and can block access to it without consequences.

You can disable XML-RPC with an .htaccess file or a plugin.

.htaccess is a configuration file that you can create and modify.

Just paste the following code into your .htaccess file at the root of your WordPress site (the solution uses mod_rewrite):

# Block requests for WordPress xmlrpc.php file
RewriteRule ^xmlrpc\.php - [NC,F]

Your server must support .htaccess files and mod_rewrite – most hosts do.
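As a sketch of an alternative that does not rely on mod_rewrite, the standard Apache 2.4 authorization directives can deny the file directly (place this in the same .htaccess):

```apache
# Deny all requests to xmlrpc.php (Apache 2.4 syntax)
<Files "xmlrpc.php">
    Require all denied
</Files>
```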

WordPress error “Another update is currently in progress” (SOLVED)

When updating a WordPress site, for example, when migrating to a new version of WordPress, you may encounter an error:

Another update is currently in progress.

This problem is fairly easy to fix. Better yet, the error is not fatal: it does not prevent users from browsing the site, and the webmaster can still get into the WordPress admin area to solve the problem.

Why does the error “Another update is currently in progress” occur?

You may see this message if the site has several administrators and two of you try to update WordPress at the same time. In this case, wait until the other webmaster finishes.

If you are the only administrator of the site, then the cause of this error may be a failed previous update, which was interrupted, for example, due to a broken connection.

How to fix “Another update is currently in progress” with a plugin

Since the WordPress admin panel remains accessible, this error can be fixed using a plugin.

The plugin is called “Fix Another Update In Progress” and can be installed through the WordPress Admin Panel.

To do this, in the admin panel, go to “Plugins” → “Add New”.

Search for “Fix Another Update In Progress”, install and activate this plugin.

Then go to “Settings” → “Fix Another Update In Progress” and click the “Fix WordPress Update Lock” button.

After that, the problem should be fixed.

How to fix “Another update is currently in progress” in phpMyAdmin

If you don't want to install the plugin, then this error can be fixed by deleting one value from the database of the WordPress site. For ease of editing the database, you can use phpMyAdmin.

Start by finding the database of the site you want to fix.

Open a table named “wp_options”.

Find the row named “core_updater.lock”. To speed up the search, you can use the database search on the “option_name” column.

Click the “Delete” button.

After that, the problem will be solved.
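For reference, the same fix as a single SQL query (a sketch assuming the default wp_ table prefix – adjust it if your site uses a different one); it can be run from the SQL tab in phpMyAdmin:

```sql
DELETE FROM wp_options WHERE option_name = 'core_updater.lock';
```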

How to block access to my site from a specific bux site or any other site with negative traffic

There are situations when negative traffic comes from certain sites, for example, from bux sites or simply from sites that you don't like. In some cases, such traffic can be dealt with, but not always.

Bux sites often post tasks like “go to a search engine, enter such-and-such a query, open such-and-such a site” – there is little you can do about this, since such visits are hard to distinguish from ordinary traffic.

But if the request is made directly from the bux site, or is shown in an iframe, then this can be dealt with.

This method will also work if your site has been added to an aggregator, or if a link to it has been placed on a site that you don't like.

For example, a bad site is https://site.click/. To block traffic from this site, you can use the following:

RewriteCond %{HTTP_REFERER} https://site.click/ [NC]
RewriteRule .* - [R=404]

These lines need to be written to the .htaccess file. These are the rules for the mod_rewrite module, which is usually enabled in Apache.

In this case, everyone who comes from https://site.click/ will be shown a “404 page not found” message. If desired, you can use any other response code instead of 404 – for example, 403 (access denied) or 500 (internal server error).

If you want to block access from multiple sites, use the [OR] flag, for example:

RewriteCond %{HTTP_REFERER} https://site.click/ [NC,OR]
RewriteCond %{HTTP_REFERER} anotherdomain\.com [NC,OR]
RewriteCond %{HTTP_REFERER} andanotherdomain\.com [NC,OR]
RewriteCond %{HTTP_REFERER} onemoredomain\.com [NC]
RewriteRule .* - [R=404]

Note that the [OR] flag does not need to be specified on the last line.

Instead of displaying an error, you can redirect to any page of your site, for example, in the following case, all users who come from the site https://site.click/ will be sent to the error.html page of your site:

RewriteCond %{HTTP_REFERER} https://site.click/ [NC]
RewriteRule .* error.html [R]

And the following rules redirect everyone who came from https://site.click/ to https://natribu.org/ru/:

RewriteCond %{HTTP_REFERER} https://site.click/ [NC]
RewriteRule .* https://natribu.org/ru/ [R]

How to protect my website from bots

In the article “How to block by Referer, User Agent, URL, query string, IP and their combinations in mod_rewrite” I showed how to block requests that match several parameters at once – on the one hand, this is effective against bots; on the other, it practically eliminates false positives, that is, cases when a regular user unrelated to the bots gets blocked.

Blocking bots is not hard; what is hard is finding the patterns that give a bot request away. That article was supposed to have another part showing exactly how I assembled these patterns. I wrote it and took screenshots, but ultimately didn't include it – not out of greed, but because I thought it strayed from the topic of an already difficult article, and, frankly, very few people are interested in it.

But the day before yesterday, bots started attacking another of my sites, and when I decided to take action against them… I realized I had forgotten how I collected the data. So, to avoid reinventing these commands every time, they will now be stored here. You might find them useful too.

How to know that a site has become a target for bots

The first sign is a sharp and unreasonable increase in traffic. This was the reason to go to Yandex.Metrica statistics and check “Reports” → “Standard reports” → “Sources” → “Sources, summary”:

Yes, there is a sharp surge in direct visits, and today there are even more of them than traffic from search engines.

Let's look at Webvisor:

Short sessions from mobile devices, strange User Agents (includes very old devices), specific nature of the region/ISP. Yes, these are bots.

Identifying the bots' IP addresses

Let's look at the command:

cat site.ru/logs/access_log | grep '"-"' | grep -E -i 'android|iPhone' | grep -i -E -v 'google|yandex|petalbot' | awk '{ print $1 }' | sort | uniq -c

In it:

  • cat site.ru/logs/access_log — read the web server log file
  • grep '"-"' — keep only requests with an empty referrer
  • grep -E -i 'android|iPhone' — keep only mobile requests
  • grep -i -E -v 'google|yandex|petalbot' — remove requests from the listed web crawlers
  • awk '{ print $1 }' — keep only the IP address (the first field)
  • sort | uniq -c — sort, deduplicate, and show how many times each IP occurs
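To make the output format concrete, here is a self-contained demo of the same pipeline on two fabricated log lines (the IPs, dates and User Agents are made up; the field positions match the common Apache “combined” log format):

```shell
printf '%s\n' \
  '185.176.24.10 - - [01/Jan/2024:00:00:00 +0000] "GET /page/ HTTP/1.1" 200 1234 "-" "Mozilla/5.0 (Linux; Android 10; SM-G970F)"' \
  '8.8.8.8 - - [01/Jan/2024:00:00:01 +0000] "GET /other/ HTTP/1.1" 200 555 "https://example.com/" "Mozilla/5.0 (Windows NT 10.0)"' \
  > /tmp/access_log_demo
# Only the first line survives: empty referrer ("-") plus a mobile User Agent
cat /tmp/access_log_demo | grep '"-"' | grep -E -i 'android|iPhone' \
  | grep -i -E -v 'google|yandex|petalbot' | awk '{ print $1 }' | sort | uniq -c
# → "1 185.176.24.10" (with leading spaces added by uniq -c)
```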

In my opinion, everything is pretty obvious: all requests come from the same subnet, 185.176.24.0/24.

But it is still morning and there is little data yet, so let's also check yesterday's web server log:

zcat site.ru/logs/access_log.1 | grep '"-"' | grep -E -i 'android|iPhone' | grep -i -E -v 'google|yandex|petalbot' | awk '{ print $1 }' | sort | uniq -c

Yes, all bots came from the 185.176.24.0/24 network.

Basically, you could just block this entire subnet and leave it at that. But it is better to keep collecting data – I will explain why below.

Let's see which pages the bots are requesting:

cat site.ru/logs/access_log | grep '"-"' | grep -E -i 'android|iPhone' | grep -i -E -v 'google|yandex|petalbot' | grep '185.176.24' | awk '{ print $7 }' | sort | uniq -c

zcat site.ru/logs/access_log.1 | grep '"-"' | grep -E -i 'android|iPhone' | grep -i -E -v 'google|yandex|petalbot' | grep '185.176.24' | awk '{ print $7 }' | sort | uniq -c

These commands have new parts:

  • grep '185.176.24' — filter for requests from the attacker's network
  • awk '{ print $7 }' — the requested page in my server logs is the seventh column

The bot requests exactly 30 pages.

We return to the article “How to block by Referer, User Agent, URL, query string, IP and their combinations in mod_rewrite” and block the bot.

But in my case, I can get by with blocking the subnet.

In Apache 2.4:

<RequireAll>
	Require all granted
	Require not ip 185.176.24
</RequireAll>

In Apache 2.2:

Deny from 185.176.24

Keep your finger on the pulse

This is not the first influx of bots I have fought, and you need to remember that the bot owner changes the bots' settings in response to your actions. For example, the previous time it all started with the following pattern:

  • the bots requested 5 specific pages
  • all bots had an Android User Agent
  • they came from a specific set of mobile carrier networks
  • the referrer was empty

After I blocked based on these signs, the bot owner changed the bots' behavior:

  • more URLs were added (now 8 pages)
  • iPhone was added as a User-Agent
  • the number of subnets increased, but the bots still came only from mobile carriers

I blocked them too. After that, the bot engine added desktop User Agents, but all the other patterns remained the same, so I blocked it again successfully.

After that, the bot owner stopped changing the bots' behavior, and after some time (a week or two) the bots stopped trying to visit the site, so I deleted the blocking rules.

For further analysis

A command for filtering requests from the specified subnet (185.176.24.0/24) that received response code 200 (that is, were not blocked) – useful in case the bots change their User Agent:

cat site.ru/logs/access_log | grep '"-"' | grep -E -i 'android|iPhone' | grep -i -E -v 'google|yandex|petalbot' | grep '185.176.24' | grep ' 200 ' | tail -n 10

A variant of the IP-list command given at the beginning of this article, but counting only requests with response code 200 (requests we have already blocked are filtered out):

cat site.ru/logs/access_log | grep '"-"' | grep -E -i 'android|iPhone' | grep -i -E -v 'google|yandex|petalbot' | grep ' 200 ' | awk '{ print $1 }' | sort | uniq -c

Command for monitoring the latest requests specific to bots:

cat site.ru/logs/access_log | grep '"-"' | grep -E -i 'android|iPhone' | grep -i -E -v 'google|yandex|petalbot' | tail -n 10

How the influx of bots affects the site

This time I reacted pretty quickly – within a day of the attack starting. But the previous time, bots roamed my site for a couple of weeks before I got tired of it. Even that had no impact on the site's position in the search results.

How to block by Referer, User Agent, URL, query string, IP and their combinations in mod_rewrite

As part of the fight against the influx of bots to the site (see the screenshot above), I had to refresh my knowledge of mod_rewrite. Below are examples of mod_rewrite rules that allow you to perform certain actions (such as blocking) for users who meet a large number of criteria at once – see the most recent example to see how flexible and powerful mod_rewrite is.

See also: How to protect my website from bots

Denying access with an empty referrer (Referer)

The following rule will deny access to all requests in which the HTTP Referer header is not set (in Apache logs, "-" is written instead of the Referer line):

RewriteEngine	on
RewriteCond	%{HTTP_REFERER}	^$
RewriteRule	^.*	-	[F,L]

Blocking access by User Agent

When blocking bots by User Agent, it is not necessary to specify the full name – you can specify only part of the User Agent string to match. Special characters and spaces must be escaped.

For example, the following rule will block access for all users whose User Agent string contains “Android 10”:

RewriteEngine	on
RewriteCond	%{HTTP_USER_AGENT}	"Android\ 10"
RewriteRule	^.*	-	[F,L]

Examples of User Agents blocked by this rule:

  • Mozilla/5.0 (Linux; Android 10; SM-G970F) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/87.0.4280.66 Mobile Safari/537.36
  • Mozilla/5.0 (Linux; Android 10; Redmi Note 7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/87.0.4280.66 Mobile Safari/537.36

How to block access by exact match User Agent

If you need to block access by an exact User Agent match, use the If construct (it is not part of mod_rewrite, but don't forget this option):

<If "%{HTTP_USER_AGENT} == 'Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1)'">
	Require all denied
</If>

<If "%{HTTP_USER_AGENT} == 'Mozilla/5.0 (Windows NT 10.0; WOW64; rv:45.0) Gecko/20100101 Firefox/45.0'">
	Require all denied
</If>

<If "%{HTTP_USER_AGENT} == 'Mozilla/5.0 (Windows NT 6.1; rv:45.0) Gecko/20100101 Firefox/45.9.0'">
	Require all denied
</If>

The If directive is available since Apache 2.4.

Denying access to certain pages

The %{REQUEST_URI} variable contains everything that follows the hostname in the request (but not what comes after the question mark in the URL); using it, you can filter requests by URL, file names, or parts of them. For example:

RewriteEngine	on
RewriteCond	%{REQUEST_URI}	"query-string"
RewriteRule	^.*	-	[F,L]

Although some characters, including Cyrillic, appear URL-encoded in the Apache logs, you can specify Cyrillic or other national-alphabet letters directly in these rules. For example, the following rule will block access to an article with the URL https://site.ru/how-to-find-which-file-from/:

RewriteEngine	on
RewriteCond	%{REQUEST_URI}	"how-to-find-which-file-from"
RewriteRule	^.*	-	[F,L]

If you wish, you can specify several URLs (or their parts) at once. Each search string must be enclosed in parentheses; the parenthesized strings must be separated by | (pipe), for example:

RewriteEngine	on
RewriteCond	%{REQUEST_URI}	"(windows-player)|(how-to-find-which-file-from)|(how much-RAM)|(how-to-open-folder-with)|(7-applications-for)"
RewriteRule	^.*	-	[F,L]

Since %{REQUEST_URI} does not include what comes after the question mark in the URL, use %{QUERY_STRING} to filter by the query string that follows the question mark.

How to filter by the query string following the question mark

The %{QUERY_STRING} variable contains the query string that follows the ? (question mark) of the current request to the server.

Note that the filtered value must be URL encoded. For example, the following rule:

RewriteCond %{QUERY_STRING} "p=5373&%D0%B7%D0%B0%D0%B1%D0%BB%D0%BE%D0%BA%D0%B8%D1%80%D0%BE%D0%B2%D0%B0%D1%82%D1%8C"
RewriteRule ^.* - [F,L]

blocks access to the page https://suay.ru/?p=5373&заблокировать, but will not deny access to the page https://suay.ru/?p=5373.
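To build such a pattern, you can percent-encode the fragment from the command line (this sketch shells out to python3 for the encoding; any URL-encoder will do):

```shell
printf '%s' 'заблокировать' \
  | python3 -c 'import sys, urllib.parse; print(urllib.parse.quote(sys.stdin.buffer.read()))'
# → %D0%B7%D0%B0%D0%B1%D0%BB%D0%BE%D0%BA%D0%B8%D1%80%D0%BE%D0%B2%D0%B0%D1%82%D1%8C
```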

Denying access to IPs and ranges

With mod_rewrite, you can block individual IPs from accessing the site:

RewriteEngine	on
RewriteCond	"%{REMOTE_ADDR}"	"84.53.229.255"
RewriteRule	^.*	-	[F,L]

You can specify multiple IP addresses to block:

RewriteEngine	on
RewriteCond	"%{REMOTE_ADDR}"	"84.53.229.255" [OR]
RewriteCond	"%{REMOTE_ADDR}"	"123.45.67.89" [OR]
RewriteCond	"%{REMOTE_ADDR}"	"122.33.44.55"
RewriteRule	^.*	-	[F,L]

You can also use ranges, but remember that in this case, strings are treated as regular expressions, so the CIDR notation (for example, 94.25.168.0/21) is not supported.

Ranges must be specified as regular expressions – this can be done using character sets. For example, to block the following ranges

  • 94.25.168.0/21 (range 94.25.168.0 - 94.25.175.255)
  • 83.220.236.0/22 (range 83.220.236.0 - 83.220.239.255)
  • 31.173.80.0/21 (range 31.173.80.0 - 31.173.87.255)
  • 213.87.160.0/22 (range 213.87.160.0 - 213.87.163.255)
  • 178.176.72.0/21 (range 178.176.72.0 - 178.176.75.255)

the rule will work:

RewriteEngine	on
RewriteCond	"%{REMOTE_ADDR}"	"((94\.25\.1[6-7])|(83\.220\.23[6-9])|(31\.173\.8[0-7])|(213\.87\.16[0-3])|(178\.176\.7[2-5]))"
RewriteRule	^.*	-	[F,L]

Note that the range 94.25.168.0 - 94.25.175.255 cannot be written as 94.25.1[68-75]: that would be interpreted as the string “94.25.1” followed by a character class containing the character 6, the range 8-7 and the character 5. Because of the 8-7 range, this entry will cause an error on the server.

Therefore, 94.25.168.0 - 94.25.175.255 is written as “94\.25\.1[6-7]”. Yes, this does not convey the original range exactly – you can complicate the regular expression for more precision. But in my case this is a temporary hotfix, so it will do.

Also note that the last octet 0-255 can be skipped, since part of the IP address is enough to match the regular expression.
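Before putting such a pattern into .htaccess, you can sanity-check it with grep -E, which uses the same extended-regex syntax (the IPs below are arbitrary test values):

```shell
re='^((94\.25\.1[6-7])|(83\.220\.23[6-9])|(31\.173\.8[0-7])|(213\.87\.16[0-3])|(178\.176\.7[2-5]))'
echo '94.25.168.1' | grep -E "$re"                     # inside a range: line is printed
echo '94.25.180.1' | grep -E "$re" || echo 'no match'  # outside: prints "no match"
```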

Combining access control rules

Task: block users who meet ALL of the following criteria at once:

1. Empty referrer

2. The user agent contains the string “Android 10”

3. Access was made to a page whose URL contains any of the strings

  • windows-player
  • how-to-find-which-file-from
  • how much-RAM
  • how-to-open-folder-with
  • 7-applications-for

4. The user has an IP address belonging to any of the ranges:

  • 94.25.168.0/21 (range 94.25.168.0 - 94.25.175.255)
  • 83.220.236.0/22 (range 83.220.236.0 - 83.220.239.255)
  • 31.173.80.0/21 (range 31.173.80.0 - 31.173.87.255)
  • 213.87.160.0/22 (range 213.87.160.0 - 213.87.163.255)
  • 178.176.72.0/21 (range 178.176.72.0 - 178.176.75.255)

The following set of rules will match the specified task:

RewriteEngine	on
RewriteCond	"%{REMOTE_ADDR}"	"((94\.25\.1[6-7])|(83\.220\.23[6-9])|(31\.173\.8[0-7])|(213\.87\.16[0-3])|(178\.176\.7[2-5]))"
RewriteCond	%{HTTP_REFERER}	^$
RewriteCond	%{HTTP_USER_AGENT}	"Android\ 10"
RewriteCond	%{REQUEST_URI}	"(windows-player)|(how-to-find-which-file-from)|(how much-RAM)|(how-to-open-folder-with)|(7-applications-for)"
RewriteRule	^.*	-	[F,L]

Note that conditions joined by logical OR must be gathered into one large condition. That is, you cannot use the [OR] flag on any of the conditions here, otherwise it will break the logic of the entire rule set.

By the way, I overcame the bots.

Redirect to HTTPS not working in WordPress

This is not an obvious problem: for some pages the redirect to HTTPS works, but for others it does not. I ran into it on WordPress quite by accident. So if you are a webmaster with WordPress sites, I recommend checking your sites too.

Redirecting from HTTP to HTTPS is quite simple: add the following lines to the .htaccess file:

RewriteEngine on
RewriteCond %{HTTPS} !on
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI}

This is how I redirect to HTTPS on most of my sites.

To test the redirect to HTTPS, it is best to look at the HTTP response headers, since web browsers tend to open a site over HTTPS even if you explicitly specify HTTP in the URL – at least I have noticed this with pages already opened over HTTPS.

In Linux, the response HTTP headers can be viewed with a command of the form (it will show both the headers and the response body):

curl -v 'URL'

And this command will show only headers:

curl -I 'URL'

If you run Windows, then you can use an online service to display HTTP headers.

We enter the site address http://site.ru/

Received HTTP redirect code:

HTTP/1.1 302 Found

We were redirected to the HTTPS version:

Location: https://site.ru/

Is everything working as it should?

We continue to check. We enter the site address http://site.ru/page-on-site

And… we get code 200 – that is, the page is served at the HTTP address without a redirect to HTTPS.

This behavior can be observed on sites with pretty (sometimes called SEO-friendly) page URLs. In WordPress, these can be selected under Control Panel → Settings → Permalinks. Examples:

  • Day and name: https://suay.site/2021/05/21/sample-post/
  • Month and name: https://suay.site/2021/05/sample-post/
  • Numeric: https://suay.site/archives/123
  • Post name: https://suay.site/sample-post/

The point is that in order for any of these options to work, WordPress adds the following lines to the .htaccess file:

# BEGIN WordPress
# The directives (lines) between `BEGIN WordPress` and `END WordPress`
# are dynamically generated and should only be modified via WordPress filters.
# Any changes to the directives between these markers will be overwritten.
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteRule .* - [E=HTTP_AUTHORIZATION:%{HTTP:Authorization}]
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>

# END WordPress

These lines contain mod_rewrite conditions and a rule with the [L] flag, which means that processing of the mod_rewrite rules stops there. As a result, the turn of the HTTP-to-HTTPS redirect rule never comes.

That is, the redirect lines must be placed before the fragment that is generated by WordPress. Let's try:

Found
The document has moved here.

Additionally, a 302 Found error was encountered while trying to use an ErrorDocument to handle the request.

The situation has changed but has not improved.

It is necessary to add the [L] flag to the rewrite rule, and place these rules in the .htaccess file before the fragment from WordPress:

RewriteEngine on
RewriteCond %{HTTPS} !on
RewriteCond %{REQUEST_URI} !^/.well-known/
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L]

# BEGIN WordPress
# The directives (lines) between `BEGIN WordPress` and `END WordPress`
# are dynamically generated and should only be modified via WordPress filters.
# Any changes to the directives between these markers will be overwritten.
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteRule .* - [E=HTTP_AUTHORIZATION:%{HTTP:Authorization}]
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>

# END WordPress

After that, everything will work exactly as you expect. All URLs, both the Front Page and other posts, starting with http:// will be redirected to https://

By default, the code will be “302 Found” (a temporary redirect). If you wish, you can use “301 Moved Permanently” instead:

RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

How to find out all DNS records of sites behind CloudFlare

How to list all DNS records for a domain

Using the dig command, you can display all types of DNS records for the specified domain at once, but it does not work in all cases:

dig zalinux.ru ANY

To query all record types while limiting the output to the answer section only, add the “+noall +answer” options:

dig zalinux.ru ANY +noall +answer

How to list all DNS records for a domain behind CloudFlare

This works fine in most cases. But some DNS servers are configured not to return a full list of records – for example, this applies to all sites behind CloudFlare.

As a result, for example, for the site hhzunt.top (hidden behind CloudFlare), the usual method fails to get the contents of DNS records:

dig hhzunt.top ANY

The answer section contains only a reference to a document – RFC 8482, which allows DNS servers to give minimal responses to ANY queries:

hhzunt.top.		3787	IN	HINFO	"RFC8482" ""

Since queries for individual record types cannot be refused this way, and the number of record types is finite, you can enumerate them one by one:

dig hhzunt.top A +short
dig hhzunt.top AAAA +short
dig hhzunt.top SOA +short
dig hhzunt.top MX +short

etc.
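The one-by-one queries can be wrapped in a loop (a sketch; example.com stands in for the domain you are checking, and dig must be installed):

```shell
# Query the common record types in turn and label each section
for t in A AAAA CNAME MX NS SOA TXT SRV CAA; do
  printf '== %s ==\n' "$t"
  dig +short example.com "$t" 2>/dev/null || true
done
```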

You can also use the online service on the w-e-b.site or SuIP.biz website, where a new method has been added to obtain all DNS records for a specific site. The essence of the method is a full enumeration; queries are made to the DNS server for each type of record. As a result, it is now possible to get a complete list of DNS records even for sites behind CloudFlare.

Service address: https://w-e-b.site/?act=alldns

Its mirror: https://suip.biz/?act=alldns

Enter the site address, select “Enumeration” as the method.

An example of getting all DNS records for a site behind CloudFlare:

The SVCB and HTTPS records are displayed for all sites – regardless of whether the administrator set them or not; their value repeats the contents of the A record.

Permanent message “Briefly unavailable for scheduled maintenance. Check back in a minute.” (SOLVED)

When updating plugins, themes or WordPress engine, the site is automatically closed for users and instead they see the message “Briefly unavailable for scheduled maintenance. Check back in a minute”.

This is normal and the site reopens immediately after the update is complete. But if the updates were interrupted, for example, due to your unstable Internet connection, then the website unavailable message will not disappear. This article will guide you on how to fix the problem after an interrupted WordPress update.

How to update WordPress and plugins

To check available updates and update the WordPress engine, plugins and themes go to WordPress Admin Panel → Dashboard → Updates:

Or click the icon with two arrows forming a circle – the number next to it shows how many updates are available. If you do not see this icon, there are no updates and all files are up to date.

How to fix “Briefly unavailable for scheduled maintenance. Check back in a minute”

If the update was interrupted – for example, you closed the page before the updates were completed, or you lost your Internet connection, or you received the ERR_NETWORK_CHANGED message in your web browser, then you can fix the problem by having access to the file system of the site.

1. First, wait a while to make sure that the updates are actually complete and that the message does not disappear automatically.

2. Go to the site folder

3. In the root directory (folder) of the site, find and delete the .maintenance file

Note that files starting with a dot are considered hidden on Linux systems. Therefore, if you have opened the correct folder, but you do not see this file, try to enable the display of hidden files in the file manager settings.
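From the command line, the fix is a single rm. Here it is as a scratch-directory demo (the /tmp path is only an example; in practice run it in your site root):

```shell
# Recreate the situation in a scratch directory, then remove the lock file
mkdir -p /tmp/site-root && touch /tmp/site-root/.maintenance
rm -f /tmp/site-root/.maintenance
ls -a /tmp/site-root    # note: plain "ls" would not show dot-files anyway
```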

How to fix “Another update is currently in progress”

In some situations, for example, if you wait a long time after an unsuccessful update, the message that the site is unavailable for maintenance disappears and it seems that everything starts working as before, but when you try to perform the WordPress update, a message appears:

WordPress auto-update failed to complete – please try again.

Update WordPress

Another update is currently in progress.

This message will not disappear on its own. To correct this error, follow the steps above – that is, locate and delete the .maintenance file.   

What does “Programmable Search Engine revenue sharing changes start April 30th, 2021” mean?

The other day I received this letter from AdSense:

Programmable Search Engine revenue sharing changes start April 30th, 2021
Beginning April 30th, 2021, we will discontinue revenue sharing on the following search engines:
The public URL - a link provided for a Google hosted public page for your engine that hosts both the search box and the search results
The Google hosted layout, in which only the search results are displayed on a Google-hosted webpage
Search engines that fall into the two above categories will continue to show ads, but no revenue will be shared.
What does this mean for me?
To continue sharing revenue, you’ll need to use the Search Element deployed on your own site, which will continue to allow for monetization.
What should I do next?
Please review the developer site for more information.

Do you understand any of that? I didn't understand a thing – and the text is especially hard to follow in its localized version.

The bottom line is that in the search results on your site, ads will still appear, but Google will take all the money for this ad.

“The Google hosted layout” – what is it? I suspect it is the overlay on top of the site in which the search results are shown, or am I wrong?

The links lead to pages of technical documentation, which does not add clarity to the question: does this mean that the “Search engine” blocks created in the AdSense dashboard will no longer earn money for site owners? Or does this not apply to those blocks?

Related article: Search engine ad: why nothing was found and why it doesn’t show ads

From the documentation and from the AdSense dashboard, links lead to https://cse.google.com/cse/all, sometimes to https://programmablesearchengine.google.com/cse – both show the same thing.

The string “Programmable Search Engine” from the letter and the subdomain programmablesearchengine.google.com hint that these are related things.

On the Programmable Search Engine settings tab, you can see the line “Edition - Standard with revenue sharing”.

And you can also see “Public URL” there – yeah, the first paragraph of the letter “The public URL” refers to this, that is, if you search on a page like https://cse.google.com/cse?cx=d95930401ffbc147a, then there will be advertising, but you will not be given money for it.

If you click on the “Get code” button, then the code is approximately the following:

<script async src="https://cse.google.com/cse.js?cx=d95930401ffbc147a"></script>
<div class="gcse-search"></div>

Does this fall under “The Google hosted layout, in which only the search results are displayed on a Google-hosted webpage” or not?!

Technically, an overlay could very well be a “Google-hosted webpage”, that is, a page loaded from Google and displayed on top of your site. Even an ordinary search block on a site can, from a technical point of view, be a page hosted on Google that is loaded asynchronously and displayed on your site.

All in all, this letter is a clear example of how not to write notifications, since nothing is clear from such a message.

Or, on the contrary, it is an example of how to write a notification if you want no one to understand anything…

In the end, I managed to figure out what “Google-hosted” means.

The search box is placed on one of your webpages. The search results are displayed on a Google-hosted webpage, which can be opened either in the same window or in a new window.

This means that the upcoming changes will not affect the Search engine blocks created in the AdSense dashboard.

So, they will no longer share the revenue if:

  1. The search form and the results are placed on a Google page
  2. The search form is placed on your site, and the search results on a Google page

Search engine ad: why nothing was found and why it doesn’t show ads

The ability to set up a custom Google search on your site, including one that displays ads, has been available for at least a decade. But now there is a dedicated ad unit in AdSense called “Search engine”.

How profitable are search pages?

Approximately 1 out of 100 visitors to a site will search for something on it. And about 1 out of 100 people who search will click on an ad. On my sites, search results pages bring in several times less revenue than regular pages. In other words, Programmable Search will generate tangible income only if you have really large volumes of traffic.
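To see why large traffic is needed, the two 1-in-100 rates can be multiplied out. A small back-of-the-envelope sketch (the function name and default rates are illustrative, taken from the observations above, not official figures):

```javascript
// Back-of-the-envelope estimate based on the rates observed above
// (~1 in 100 visitors searches, ~1 in 100 searchers clicks an ad).
function estimateSearchAdClicks(monthlyVisitors, searchRate = 0.01, adClickRate = 0.01) {
  return monthlyVisitors * searchRate * adClickRate;
}

// A site with 100,000 visitors a month yields only about 10 search-ad
// clicks, which is why tangible income requires really large traffic.
console.log(estimateSearchAdClicks(100000)); // -> 10
```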

But this search has other advantages as well:

  • you can set up a search across several sites at once – if a user enters a query that is answered not on this site but on another of your sites, it will appear in the search results
  • you can use search to promote pages (make them appear in the results), for example pages with CPA offers and the like

In any case, a site needs search. Google's search engine is very good, and on top of that it brings in at least some earnings.

It could not be easier to use: create an ad unit, add the code to a widget on the site, and you're done! If you previously (many years ago) set up the search with two snippets of code – one for the form and one for the results – that is no longer needed by default: now the results are shown directly in the widget itself.
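For reference, the old two-part layout is still possible with the same script. A sketch reusing the cx ID from the snippet above; the gcse-searchbox and gcse-searchresults class names come from Google's Custom Search Element documentation:

```html
<!-- Loads the same Search Element script as the default snippet -->
<script async src="https://cse.google.com/cse.js?cx=d95930401ffbc147a"></script>

<!-- Search form, e.g. in the site header -->
<div class="gcse-searchbox"></div>

<!-- Search results, e.g. in the main content area -->
<div class="gcse-searchresults"></div>
```

With the default single-widget code, both of these are replaced by one `<div class="gcse-search"></div>` element, as shown earlier.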

How to create a Google site search form

In AdSense, go to Ads → Overview → By ad unit and select “Search engine”.

On the page that opens:

  1. Enter a name.
  2. List the sites you want to search (one per line).
  3. Enter a search string to see example results.
  4. Click the Create button.

When entering a list of sites, the following help is given:

Specify a list of sites to search, one per line. You can add any of the following:

  • Individual pages: www.example.com/page.html
  • Entire site: www.example.com/*
  • Parts of site: www.example.com/docs/* or www.example.com/docs/
  • Entire domain: *.example.com

Following these hints, you might think that to cover a whole domain you must enter “*.suay.site”, when in fact “suay.site” is also accepted. In reality, at first neither option works as expected – how to fix this is described below.

Everything is ready – copy the code and paste it into the website widget.

We check the search on the site and… nothing was found.

Why nothing was found in the Search engine ad unit

Go back to the ad unit overview and press the edit button.

The Programmable Search Engine editor will open.

Scroll down the page that opens until you see the “Sites to search” section.

Click on each site and switch from “Include just this specific page or URL pattern I have entered” to “Include all pages whose address contains this URL”.

There is no need to change the code – it remains the same.

We check again – now everything works.

Why does the "Search engine" block not show ads?

To answer this question, go to Setup → Ads, where you will see:

Note: in order to ensure a high quality experience for our advertisers, we are reviewing Programmable Search Engine ad traffic quality. It may take several weeks for revenue sharing to begin.

That is, there may be no advertisements for the first few weeks – nothing can be done, you have to wait.

By the way, check that “Search Engine Monetization” is enabled in the same place.

How to Change the Design of the Adsense Search Engine Ad Block

By default, search results are shown below the input form, stretching into a long “sausage”. You can change that. To do this, on the “Programmable Search Engine” edit page, go to the “Look and feel” tab and select the desired search results design.

On suay.site, I chose the “Overlay” option, in which search results are shown in a large area that overlaps the page content.

When adding sites or changing the design, the ad unit code does not need to be changed.

How to promote pages through website search

On the Programmable Search Edit page, go to the Search Features tab and move the Enable promotions slider to the On position.

Add the pages you want to promote.
