By ngchinann on July 22, 2019
Google is spearheading an effort to make the Robots Exclusion Protocol a web standard
Today we're announcing that after 25 years of being a de-facto standard, we worked with Martijn Koster (@makuk66), webmasters, and other search engines to make the Robots Exclusion Protocol an official standard! https://t.co/Kcb9flvU0b
— Google Webmasters (@googlewmc) July 1, 2019
The effort was carried out by Google with Martijn Koster (the original author of the protocol), webmasters, and other search engines.
Following the announcement, Google also
1. Published a blog post discussing the REP (Read it here)
2. Updated its official developer documentation on the REP (Read it here)
3. Released its robots.txt parser as open source (Read it here)
Google will drop support for unofficial Robots Exclusion Protocol directives starting September 1st
Google followed up with another blog post about unsupported rules in robots.txt here.
If you’ve been using your robots.txt to specify the noindex, nofollow, or crawl-delay directives (all of which are unofficial), you’ll have to find another way to make them work before September 1st, or things might get ugly.
That is especially true for those using noindex to hide low-quality content from the search engine index. Come September 1st, you might find a bunch of content you intended to keep out of the SERP being crawled and indexed, showing up on the results page instead.
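For illustration, here's what an affected robots.txt might look like (the paths below are made up). The unofficial lines will simply be ignored by Googlebot after the deadline, while official directives such as Disallow keep working:

```
# Unofficial directives (never part of the standard),
# ignored by Google from September 1st onward:
User-agent: *
Crawl-delay: 10
Noindex: /drafts/
Nofollow: /drafts/

# Official directives like Disallow continue to work:
Disallow: /drafts/
```

Note that Disallow only blocks crawling; a disallowed URL can still appear in the index if other pages link to it, which is why Google lists several alternatives below.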
If you’re currently using the noindex directive in your robots.txt, Google suggests a few alternatives:
1. Noindex in robots meta tags
2. 404 and 410 HTTP status codes
3. Password protection
4. Disallow in robots.txt
5. Search Console Remove URL tool
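The first alternative, for instance, is a one-line meta tag in the page's head (a documented, standard approach; the page below is just a sketch):

```html
<!DOCTYPE html>
<html>
<head>
  <!-- Tells compliant crawlers not to index this page -->
  <meta name="robots" content="noindex">
  <title>Low-quality page to keep out of the SERP</title>
</head>
<body>...</body>
</html>
```

For non-HTML resources such as PDFs or images, the same signal can be sent as an HTTP response header instead: `X-Robots-Tag: noindex`.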
This only affects those using noindex in their robots.txt file. So before you panic, remember that this is not a problem if you’re using noindex in your HTML.
Just in case anyone is confused about the robots.txt announcement by Google yesterday, it is only a concern if you are trying to use noindex in your robots.txt file. (Most sites are not doing this.)
Noindex used in the HTML of your pages is not changing.
— Marie Haynes (@Marie_Haynes) July 2, 2019
Frédéric Dubut from Bing also chimed in and tweeted that Bing never supported any of those unofficial directives, so now is definitely the right time to fix this if you weren’t aware of this issue before.
The undocumented noindex directive never worked for @Bing so this will align behavior across the two engines. NOINDEX meta tag or HTTP header, 404/410 return codes are all fine ways to remove your content from @Bing. #SEO #TechnicalSEO https://t.co/ukKhfRPWzO
— Frédéric Dubut (@CoperniX) July 2, 2019
Now that the Robots Exclusion Protocol will be standardized, we’re positive that gray area practices such as these will be ironed out and webmasters will have an easier time controlling crawling behavior.
A number of SEOs have reported that their Google My Business listings were suspended after adding a short name to their profile.
Introduced in April, Short Names was a way of allowing businesses to create custom URLs for their Google My Business listings.
Suspicions have been raised that adding a short name might be causing legitimate business listings to get suspended and removed from SERPs.
It appears that the rumors may be true – adding a @GoogleMyBiz shortname can cause your listing to be suspended???
I am working on a 100% legitimate business profile with 0 quality/spam issues – we just added a shortname and we are now suddenly suspended. Google, what gives? https://t.co/blKJMGpwee
— Lily Ray (@lilyraynyc) July 9, 2019
Not all businesses are getting suspended for adding short names, but it is a common theme among a series of seemingly random suspensions.
Google hasn’t confirmed if there’s a bug related to Google My Business short names, nor has it acknowledged that it’s even aware of this issue.
So all this evidence is anecdotal, and the consensus is that removing short names fixes the problem. If you’ve recently had a Google My Business listing suspended after adding a short name, your best course of action is to remove it.
An SEO tweeted about a sticky preview box in image search.
New @Google image loading split test is pretty cool. Not sure if seen before.
At first glance, the right-alignment was weird, but I’m already used to it. Loading speed is 2x as fast because no waiting for down-scrolling. @rustybrick pic.twitter.com/hCCT1dRGWO
— SEOwner (@tehseowner) July 2, 2019
We successfully replicated it with our own search term.
That definitely makes it easier to navigate the image SERP. What about you? Let us know if you’re being served the sticky preview box or nah.
If you’re an avid user of the Chrome Developer Tools, there’s good news! Addy Osmani from the Chrome team announced that Chrome now offers better autocomplete for partially typed CSS property values. Handy if you can’t recall the full syntax for things like gradients, transforms, filters, and more.
.@ChromeDevTools now has better autocomplete values for some CSS properties. Helpful if you can't remember the full syntax for gradients, transforms, filters etc. pic.twitter.com/LRk89gYVRP
— Addy Osmani (@addyosmani) July 5, 2019
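For context, these are the kinds of multi-part CSS values the improved autocomplete helps with (the selector and values below are made up for illustration):

```css
/* Multi-part values that are easy to half-remember */
.banner {
  background: linear-gradient(45deg, #ff8a00, #e52e71);
  transform: rotate(3deg) scale(1.05);
  filter: blur(2px) brightness(1.1);
}
```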
Updated: 12 December 2024