By jiathong on March 11, 2019
SEO is a rapidly changing industry with little certainty, and a whole lot of questions.
Do backlinks still work? Which kinds of backlinks should I disavow? Should I put more images into my content? Where should the images go? Above the fold? How many h1 tags can I use?
There is a whole lot of confusion and debate on just about everything in SEO, which is not surprising given how vastly different each website's setup and goals can be.
A lot of the time, there's no definitive answer to a question because it really depends on what you want to achieve and on what fits best into your website's structure.
But luckily, from time to time, we get clear information from Google about what we should do in a certain situation. So here I have compiled these SEO tips straight from Google themselves, which you know can't go wrong. Let's have a look, shall we?
News publishing sites that pump out a large amount of content every day need their content indexed as quickly as possible, since most of it is time-sensitive.
John Mueller from Google confirmed in a Google Webmaster Hangout that submitting a Google News sitemap is the best way to signal Googlebot to crawl your pages, and thus the quickest way to get your daily content indexed and searchable. If you are already familiar with creating a sitemap for your 'normal' sites (those that don't fall under the Google News content policies), the process should be pretty simple, apart from a few differences.
Points to note when creating a news sitemap
Like every other sitemap, a Google News sitemap should be in XML format. A general sitemap should be under 50MB in size and list no more than 50,000 URLs.
The news sitemap, however, should be separate from the sitemap you already have. Unlike a general sitemap, the Google News sitemap should be even more compact, with fewer than 1,000 URLs.
Any updates should be made directly to the sitemap instead of creating a new one. News URLs older than 2 days can be removed from the sitemap. But don't worry: as long as they are already indexed, they will remain in the Google News index, and therefore searchable, for the standard 30 days.
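If your sitemaps are generated by a script, here is a minimal sketch of what building a Google News sitemap could look like in Python, using only the standard library. The article list, publication name, and output path are placeholder assumptions, so check the current Google News sitemap reference before relying on the exact tag set.

```python
# Minimal sketch: build a Google News sitemap with Python's standard library.
# The `articles` list and "Example News" publication are hypothetical placeholders.
import xml.etree.ElementTree as ET
from datetime import datetime, timedelta

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
NEWS_NS = "http://www.google.com/schemas/sitemap-news/0.9"

articles = [
    {"loc": "https://example.com/news/story-1", "title": "Story 1",
     "published": datetime.utcnow() - timedelta(hours=3)},
]

ET.register_namespace("", SITEMAP_NS)
ET.register_namespace("news", NEWS_NS)
urlset = ET.Element(f"{{{SITEMAP_NS}}}urlset")

cutoff = datetime.utcnow() - timedelta(days=2)   # drop URLs older than 2 days
for article in articles[:1000]:                  # keep the news sitemap under 1,000 URLs
    if article["published"] < cutoff:
        continue
    url = ET.SubElement(urlset, f"{{{SITEMAP_NS}}}url")
    ET.SubElement(url, f"{{{SITEMAP_NS}}}loc").text = article["loc"]
    news = ET.SubElement(url, f"{{{NEWS_NS}}}news")
    publication = ET.SubElement(news, f"{{{NEWS_NS}}}publication")
    ET.SubElement(publication, f"{{{NEWS_NS}}}name").text = "Example News"
    ET.SubElement(publication, f"{{{NEWS_NS}}}language").text = "en"
    ET.SubElement(news, f"{{{NEWS_NS}}}publication_date").text = (
        article["published"].strftime("%Y-%m-%dT%H:%M:%SZ"))
    ET.SubElement(news, f"{{{NEWS_NS}}}title").text = article["title"]

ET.ElementTree(urlset).write("news-sitemap.xml",
                             xml_declaration=True, encoding="UTF-8")
```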
Check out what Mueller said in the video, starting from the 52-minute mark.
There was a heated argument on Twitter over whether no-indexing the sitemap would have a negative effect on ranking.
In the midst of all that confusion, John Mueller stepped in, backed one of the SEOs involved in the argument, and said:
The sitemap itself is processed as an XML file by scripts, so a noindex won't change anything with how we process that.
— 🍌 John 🍌 (@JohnMu) January 10, 2019
Before this whole fiasco happened, John had mentioned multiple times that a sitemap is not treated like a normal web page intended to be searchable by users. Thus, no-indexing it is actually a good way to handle a sitemap.
Since this comes up from time to time — it's fine to use the x-robots-tag HTTP header with "noindex" for XML sitemap files. They don't need to be indexed to work as sitemap files, they're more like robots.txt files (made for machines) than like HTML pages (made for indexing). https://t.co/ehEcshrmxb
— 🍌 John 🍌 (@JohnMu) January 9, 2019
Another point of confusion stemming from this whole fiasco is Google's earlier statement that a long-term 'noindex, follow' will be treated as 'noindex, nofollow'.
In this situation, many are asking: if the sitemap becomes nofollow, doesn't that mean Googlebot will no longer access the sitemap?
But again, that's not the case for an XML file, since the file is meant to be processed by scripts.
The 'noindex, follow' eventually becoming 'noindex, nofollow' situation only applies to web pages that are intended for human visitors.
So go ahead and no-index your sitemap; it won't hurt your SEO, and no random people from the internet will stumble upon your sitemap file.
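If your sitemap is served by an application rather than as a static file, a hedged sketch of adding that noindex header could look like the Flask example below. Flask is only an illustrative choice and the file path is a placeholder; most web servers and frameworks have an equivalent one-liner for setting response headers.

```python
# Sketch: serve sitemap.xml with an X-Robots-Tag: noindex header (Flask assumed).
from flask import Flask, Response

app = Flask(__name__)

@app.route("/sitemap.xml")
def sitemap():
    with open("sitemap.xml", "rb") as f:   # placeholder path to your generated sitemap
        xml = f.read()
    resp = Response(xml, mimetype="application/xml")
    # Keeps the sitemap file itself out of the search index; Googlebot still
    # processes it as a sitemap.
    resp.headers["X-Robots-Tag"] = "noindex"
    return resp
```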
Google will drop, and thus stop crawling, a URL if it consistently returns a 404 code, and the URL pointing to your XML sitemap will suffer the same fate.
However, this is not to be confused with the URLs listed inside the sitemap returning 404s. What matters here is that the sitemap itself is reachable, so make sure the URL of your sitemap file is working correctly.
In the case of a website migration, on the other hand, John suggests simply 301 redirecting your sitemap URL to the new location. There are other options, like letting it 404 in the long run, but to make your life easier, 301 it just like you're 301 redirecting your other pages.
No, that's wrong. We'll drop your sitemap file if the URL OF YOUR SITEMAP FILE does not work. If /sitemap.xml is a 404, we'll stop fetching it over time.
— 🍌 John 🍌 (@JohnMu) January 17, 2019
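A quick way to keep an eye on this is a small script that checks the status code of your sitemap URL(s). The sketch below uses placeholder URLs and the third-party requests library, so treat it as a starting point rather than a finished monitoring setup.

```python
# Sketch: confirm the sitemap URL itself is not 404ing, and that a migrated
# sitemap 301s to its new location. URLs are placeholders.
import requests

SITEMAP_URLS = [
    "https://example.com/sitemap.xml",
    "https://old.example.com/sitemap.xml",   # should 301 to the new domain after a migration
]

for url in SITEMAP_URLS:
    response = requests.get(url, allow_redirects=False, timeout=10)
    if response.status_code == 404:
        print(f"WARNING: {url} returns 404 -- Google will eventually stop fetching it")
    elif response.status_code in (301, 308):
        print(f"{url} redirects to {response.headers.get('Location')}")
    else:
        print(f"{url} responded with {response.status_code}")
```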
Google is currently the only search engine, compared to Bing, Yahoo, and Yandex, that can render and crawl JavaScript web pages. However, the technology is not entirely mature yet, and there are some major drawbacks that set it apart from the crawling process for 'normal' HTML-based web pages.
John Mueller has confirmed that sometimes the delay in Google's JavaScript rendering is caused by crawl budget limitations.
Google is actively working on reducing the gap between crawling pages and rendering them with JavaScript, but it will take some time, so they recommend dynamic, hybrid, or server-side rendering for sites with a lot of content.
Here's what John said: 'The problem is the client-side rendering also influences how quickly we can view things for crawling. So the crawl budget plus the rendering delay is something that is really hard to spot from the outside.'
He further clarified: 'It doesn't necessarily mean it's a rendering issue; it might be a crawl budget issue.'
So even if you're not aware of it, crawl budget issues CAN be the reason why Google is delaying the rendering of your web pages.
Google recommends implementing dynamic rendering to provide a 'seamless' process for crawlers. That's a good workaround if your web pages use JavaScript features that the crawlers don't support.
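To make the idea concrete, here is a rough sketch of dynamic rendering with Flask: crawlers are detected by user agent and served pre-rendered HTML, while everyone else gets the normal client-side app. The render_for_bots() helper and index.html are hypothetical stand-ins; in practice the rendering would be done by a headless browser or a prerendering service.

```python
# Sketch of dynamic rendering: bots get pre-rendered HTML, humans get the JS app.
from flask import Flask, request, send_file

app = Flask(__name__)

BOT_TOKENS = ("googlebot", "bingbot", "yandexbot", "baiduspider")

def is_crawler(user_agent: str) -> bool:
    ua = (user_agent or "").lower()
    return any(token in ua for token in BOT_TOKENS)

def render_for_bots(url: str) -> str:
    # Hypothetical stand-in: a real setup would call a headless browser or a
    # prerendering service here and return the fully rendered HTML.
    return f"<html><body>Pre-rendered content for {url}</body></html>"

@app.route("/", defaults={"path": ""})
@app.route("/<path:path>")
def serve(path):
    if is_crawler(request.headers.get("User-Agent", "")):
        return render_for_bots(request.url)
    # Regular visitors get the JavaScript application shell.
    return send_file("index.html")
```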
Since Google has made it public that their crawlers are based on Chrome 41 (M41), SEOs in general can get a better grasp of the extent of their rendering capabilities. Head over here to check out what Chrome 41 supports, only partially supports, and doesn't support at all. That can help you make a better judgement when deciding whether or not to implement certain things on your web pages.
They start discussing the topic at the 38:25 mark.
A site's crawl budget changes a lot over time, as Google's algorithms react quickly to changes made to a website. For example, if a new CMS is launched incorrectly with no caching and it's slow, then Googlebot will likely slow down crawling over the next couple of days so that the server isn't overloaded.
Crawl budget generally remains a mysterious number to many website owners, mainly because whenever someone shoots the question over to Google, they shrug and say 'don't worry about that'.
Here is what we do know about crawl budget so far.
Google's official stance on crawl budget has always been 'we don't care', and now that it's clear that crawl budget fluctuates to accommodate the changes you make on your site, we can probably move it further down the checklist of 'SEO routines I need to work on ASAP'.
Slimming down your website and so-called website 'spring-cleaning' are techniques usually associated with optimizing crawl budget.
IMO crawl-budget is over-rated. Most sites never need to worry about this. It's an interesting topic, and if you're crawling the web or running a multi-billion-URL site, it's important, but for the average site owner less so.
— 🍌 John 🍌 (@JohnMu) May 30, 2018
Now that both Mueller and Illyes have said not to worry about it, you should probably put your attention on other, more important tasks like making sure your website is secure (HTTPS), fixing all your broken links (404s), and optimizing your internal linking.
Watch the video starting from the 21:55 mark.
Mueller said: 'Our algorithms are really dynamic and they try to react fairly quickly to changes that you make on your site. If your site becomes slower, the Googlebot will adjust the crawling speed (thus reducing the number of pages crawled, i.e. the crawl budget) so as not to overwhelm the server, and vice versa. It's not something that is assigned one time to a website.'
John recommends placing videos fairly high up on a page to help Google understand it is a video landing page.
John starts talking about it at the 39:27 mark.
Video is awesome for sharing content in many ways. When used for advertising, video ads have a click-through rate of 1.84%, the highest of all digital ad formats.
A landing page is essentially a pitch used to sell your products or services, and in that situation a video can clearly highlight and showcase functional uses that can't easily be explained in words.
Some companies even offer a virtual tour of their workspace using video. When an image is imprinted in your mind, the information becomes much more memorable, leaving a greater impression on your prospects.
Other than being used on a landing page, a video can also be embedded in your content pages.
If your company runs a blog, try your hand at creating a couple of videos to go with your blog posts.
Besides being a more attractive and less time-consuming way for visitors to absorb information, a video is also much more shareable than your 2,000-word blog post.
A lot of how-to queries trigger video results, so if you work in an industry where video demos answering how-to questions make sense, start making those videos.
And since Google also owns YouTube, hosting your videos on YouTube and then embedding them on your own website is the perfect way to go.
Organization markup only needs to be on one page; this can be the homepage or a contact page, for example, but make sure it doesn't exist on all pages.
Schema markup is another SEO component that, more often than not, can hurt your SEO if not executed correctly. Its complexity of implementation doesn't help the case either.
Google has been actively handing out penalties for structured markup that they deem spammy.
One way to double-check your markup before it goes live is to use Google's very own Structured Data Testing Tool.
Just copy and paste the relevant part of your code to run the test, or, if the code is already live, let the tool fetch it from your web page.
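As a concrete reference point, here is a hedged sketch of what a basic Organization markup block could look like, generated with Python's json module. The company details are placeholders; you would paste the printed <script> block into the <head> of that single page and then run it through the testing tool.

```python
# Sketch: build an Organization JSON-LD snippet with the standard json module.
# All company details below are placeholders.
import json

organization = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example Co",
    "url": "https://example.com",
    "logo": "https://example.com/logo.png",
    "sameAs": [
        "https://twitter.com/example",
        "https://www.facebook.com/example",
    ],
}

snippet = (
    '<script type="application/ld+json">\n'
    + json.dumps(organization, indent=2)
    + "\n</script>"
)
print(snippet)  # paste the output into the <head> of one page only
```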
Remember to always monitor the Structured Data report in your Google Search Console account. Try to keep the number of errors to a minimum by checking regularly and keeping up to date with the latest guidelines from Google.
And remember, like Mueller said, only use organization markup on one page of your website. Watch the video starting from 51:36.
When asked 'is there an internal linking overoptimization penalty?' in an AMA, Gary Illyes said, 'No, you can abuse your internal links as much as you want AFAIK.'
Google's official stance on links has been 'as long as it fits in naturally', which, to be honest, is pretty damn vague.
Internal linking, on the other hand, is quite different from backlinks or linking out to other websites. Since you're not sharing any link juice with anyone else, you're just building up relevance for different parts of your own website.
A good, healthy amount of internal linking is actually crucial for making your whole website more crawlable. That's because a crawler relies on a link being present in a web page before it can crawl the page that link points to.
Imagine each link as a bridge connecting one page to another. If the bridge doesn't exist at all, the page is isolated: a crawler has no way to reach it, crawl it, or index it, and you have no way to rank it.
A web page that is in no way reachable via internal links is what we call an orphan page.
If you want that page to rank, you don't want it to be an orphan page, and if you don't want it to be an orphan page, you need to link to it internally.
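If you want to check whether any of your pages have ended up orphaned, a rough sketch like the one below can help: it crawls internal links starting from the homepage and compares what it reaches against a list of pages you know should exist. The domain and page list are placeholders, and it relies on the third-party requests and beautifulsoup4 packages.

```python
# Sketch: find orphan pages by crawling internal links from the homepage.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse

SITE = "https://example.com"          # placeholder domain
KNOWN_PAGES = {                       # e.g. pulled from your CMS or sitemap
    "https://example.com/",
    "https://example.com/blog/post-1",
    "https://example.com/blog/post-2",
}

seen, queue = set(), [SITE + "/"]
while queue:
    url = queue.pop()
    if url in seen:
        continue
    seen.add(url)
    try:
        html = requests.get(url, timeout=10).text
    except requests.RequestException:
        continue
    for anchor in BeautifulSoup(html, "html.parser").find_all("a", href=True):
        link = urljoin(url, anchor["href"]).split("#")[0]
        if urlparse(link).netloc == urlparse(SITE).netloc and link not in seen:
            queue.append(link)

for page in sorted(KNOWN_PAGES - seen):
    print("Orphan page (no internal link points to it):", page)
```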
Neil Patel has created what he calls The Seven Commandments of Internal Linking:
1. Create lots of content – so you have more relevant pages that you can link to each other.
2. Use anchor text – anchoring your links on text is preferable to using images; you could also drop a naked URL there, but that would probably make less sense to your visitors.
3. Link deep – of course you can always link your new pages from the homepage, but do you want to link ALL your new pages from your homepage? Think harder about how each page fits into your website structure and where a link would actually be helpful, and link it there.
4. Use links that are natural for the reader – this whole blog post is me talking about advice from John Mueller and Gary Illyes, so it's only natural that I link you to them, right? That's what natural means.
5. Use relevant links – since I'm talking about Google giving advice on SEO, I wouldn't link you to an article about how to bake a cake; that's not relevant and won't bring you more value on the subject.
6. Use follow links – these pages are all within your own website, so why nofollow them?
7. Use a reasonable number of internal links – Gary Illyes says you can abuse them as much as you want, but will that be a good experience for your readers? Be reasonable; your readers will be happy, and happy readers are always a good thing.
To sum it all up:
1. If you run a publishing site, make sure to submit a Google News sitemap.
2. No-indexing your sitemap is totally fine.
3. Make sure your sitemap URL doesn't return a 404 error code.
4. Crawl budget problems are among the reasons why your web page rendering, and thus indexing, can be delayed.
5. Crawl budget fluctuates according to the changes on your website.
6. Placing a video high up on a web page helps Google understand that it's a video landing page.
7. Organization markup should only be present on one page of your website.
8. There’s no internal linking over-optimization penalty.
Updated: 18 November 2024