Best Practices for Google Indexing?

TLDR
As everyone around here is much more of a web expert than I am, I figured I’d ask the folks in the know, especially since I believe many of you are web developers, designers, or site owners.

What are the best practices to get Google to index a brand new website?

Backstory
I launched my company’s website toward the end of January. Within a couple of weeks, and with some HTML/SEO adjustments along the way, I was able to get Bing to index the entire website. But alas, here I am months later and Google refuses to index things; instead I have a ton of “Crawled - currently not indexed” statuses.

As my website is brand new, I also only have a handful of backlinks, which from what I’ve read could very much be the problem. Bing in fact calls this out in the Webmaster Tools. If so, what’s the best way of getting backlinks as new websites are sort of in a Catch-22?

The other thing that Bing calls out is “Many pages with title too short,” which is true but intentional, as I’m going for a minimalist look. For example, the title of my support page is just “Support,” my contact page is just “Contact,” and so on. Do folks simply pad page titles these days for indexing and SEO?
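For what it’s worth, a more descriptive title tag doesn’t have to affect the visible page design at all, since it only appears in the browser tab and in search results. A hedged sketch (the wording and description text below are made up for illustration, not taken from the actual site):

```html
<head>
  <!-- A longer, more descriptive title; only shown in the tab and in search results. -->
  <title>Support – Ministry of Bits</title>
  <!-- A meta description is also commonly used for the search-result snippet.
       The content here is a placeholder, not the site's real copy. -->
  <meta name="description" content="Support and troubleshooting help for Ministry of Bits products.">
</head>
```

The on-page heading can stay as minimal as you like; search engines read the title tag independently of what’s rendered on the page.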

If it helps, here’s my website, Ministry of Bits.

Also, although I’m personally invested in figuring out my own woes and path forward here, I created this thread in hopes of capturing best known methods and actions for all. Looking through the forum, it looks like I’m not the only one who’s run into this so I was hoping to create a catch all thread for folks in the future.

Without getting into all the nitty-gritty details: if you want Google to index and rank your website, you need to keep adding valuable content. Aim for at least one page a week, and be sure it’s good information, not just filler or noise.

That said, your biggest issue is that Google knows how to read. “Coming soon” is short for “nothing to see here, move along.” So…


Have you added your site to Google Search Console? I would start there and, if needed, submit each page manually.

https://search.google.com/search-console/about


@Flashman Yes, Google Search Console is where I’m seeing statuses such as “Crawled - currently not indexed” across almost my entire website.

So go to the Pages tab and locate the pages that are discovered but not currently indexed. Hover over the page link to reveal the magnifying glass and click it. From there you can live-test the URL and request indexing.

This is not a 100% guarantee that your pages will be indexed, but in my experience it works pretty quickly about 80% of the time. Last week I had a brand new domain with no backlinks indexed within 48 hours this way.

Thanks! Yup, that’s exactly what I’ve been doing. I’m seeing cycle times of a bit over two weeks for them to recrawl. :frowning: Now you see why I’m months into this.

But who knows how the universe works sometimes. I just checked again and my main homepage was indexed about 3 minutes ago. So hopefully I’ll see things improve relatively soon.

I see your robots.txt file has

User-agent: *
Allow: /
Sitemap: https://ministryofbits.com/sitemap.xml

Technically that should work and I did the same at one time, but try it like this:

Sitemap: https://ministryofbits.com/sitemap.xml

User-agent: *
Disallow:

The way I suggested is a cleaner solution and indicates that everything may be indexed. The Allow directive you’re using is not part of the original robots.txt specification.

http://www.robotstxt.org/orig.html#format

Originally I started out without a robots.txt but read somewhere that having one to specify your sitemap might be helpful for search engines.

Just made the change and I appreciate the advice as every bit helps.
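Since the thread keeps coming back to the sitemap referenced from robots.txt, it may help future readers to see what a minimal sitemap.xml looks like. This is a sketch following the sitemaps.org protocol; the URLs and date below are placeholders, not the site’s actual pages:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal sitemap sketch; replace the placeholder URLs with your real pages. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://ministryofbits.com/</loc>
    <!-- lastmod is optional; search engines may use it as a recrawl hint. -->
    <lastmod>2024-01-30</lastmod>
  </url>
  <url>
    <loc>https://ministryofbits.com/support</loc>
  </url>
</urlset>
```

Most site generators and CMSs can produce this file automatically; the important parts are that it lives at the URL your robots.txt points to and that every `<loc>` is a canonical, indexable URL.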