by ThinkCode | Aug 18, 2015 | Search Engine Optimization
In the past, the meta data that you included in the code of your website played a large part in your search ranking. The search engine spiders used this information to determine what your website was about and then ranked your site accordingly.
Everyone saying they are important is right on the money! The reason they are so important is that search engines read pages top to bottom, left to right, and score terms and phrases toward the top as "more important" than terms and phrases at the bottom. Since all that meta data sits at the top of the page, it is by default "weighted heavy." The main trick is to make sure you use those same keywords and phrases ON the page, in the text; if you don't, the meta data is basically useless and can be disregarded by the search engines.
In other words, you can't just stuff terms in meta data, put a nice image on the page with no text, and expect to rank; it simply does not work that way. You must build your pages first, then create a keyword-rich title and meta data based on the page you built, pulling the terms and phrases right from the content of the page to ensure they match. This will lead to good rankings, provided other factors are in place and the competition isn't over 100,000,000 results or something crazy like that. That is where the really tricky part comes in: knowing what terms to go after in the first place! That, however, is a totally different discussion.
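If you want to verify that match yourself, here is a minimal Python sketch, assuming the `requests` and `beautifulsoup4` packages are installed (the URL is just a placeholder), that flags any title or meta-description term that never shows up in the visible body copy:

```python
# Minimal sketch: check that the terms in a page's title and meta
# description also appear in the visible body text.
# Assumes `requests` and `beautifulsoup4`; the URL is hypothetical.
import re
import requests
from bs4 import BeautifulSoup

def meta_terms_missing_from_page(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    title = soup.title.get_text() if soup.title else ""
    desc_tag = soup.find("meta", attrs={"name": "description"})
    description = desc_tag.get("content", "") if desc_tag else ""

    # Visible body text only (the <title> is excluded), lowercased
    # for a naive substring match.
    body = soup.body.get_text(" ").lower() if soup.body else ""

    # Every word of 4+ letters from the title/description should
    # show up somewhere in the body copy.
    terms = set(re.findall(r"[a-z]{4,}", (title + " " + description).lower()))
    return sorted(t for t in terms if t not in body)

# Terms that exist only in the meta data get flagged here.
print(meta_terms_missing_from_page("https://example.com/"))
```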
by ThinkCode | Jun 24, 2015 | Search Engine Optimization
There really are countless "keyword tracking tools" on the market, but few are accurate. In fact, most of them aren't. You can really only use them as a general guide because, as you know, rankings fluctuate daily, and there are personalized results, locale-targeted results and so on.
- Rank Checker by SEO Book. This is a browser extension that lets you track keywords from your desktop, and it is usually very accurate. The only problem with this tool is that Google will block you if you use it too heavily. We track 10 primary keywords per client just to get a feel for how they're performing at a quick glance.
- SEM Rush. This software allows you to see exactly
a) what keywords a site is ranking for and
b) the URL tied to each of those keywords.
This can be VERY useful for finding content marketing opportunities (reverse engineering). The only downsides to this tool we have found so far are that
a) it doesn't produce fancy-looking keyword positioning reports (it just spits out an Excel spreadsheet; see the parsing sketch below) and
b) you can oddly only set reporting to weekly.
Outside of that, it's my GO-TO tool for all things keywords.
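Since the export is just a spreadsheet, it is easy to post-process yourself. Here is a minimal Python sketch assuming a CSV export with hypothetical "Keyword" and "Position" columns; adjust the file name and headers to whatever your copy of the tool actually produces:

```python
# Minimal sketch: turn a raw keyword-position export (like the
# spreadsheet SEM Rush spits out, saved as CSV) into a quick summary.
# The file name and column headers are hypothetical examples.
import csv
from collections import defaultdict

positions = defaultdict(list)
with open("keyword_positions.csv", newline="") as f:
    for row in csv.DictReader(f):
        positions[row["Keyword"]].append(int(row["Position"]))

# Best (lowest) position seen per keyword, worst performers first.
for keyword, seen in sorted(positions.items(),
                            key=lambda kv: min(kv[1]), reverse=True):
    print(f"{keyword}: best position {min(seen)}")
```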
by ThinkCode | Jun 9, 2015 | Search Engine Optimization
Links have been the heart and soul of the web since the web began. Google says, however, that link building will become less and less important as time goes on.
With that in mind, it IS back links that get a site higher rankings in Google: back links work as a "vote of confidence" that you are REAL and that your site is valuable enough for others to add a link BACK to it, because they too consider it to be so.
A site's RANKINGS, however, really have nothing (and never have had anything) to do with where your site displays in a search result. Search results use a different set of rules and display based on your site's content, meta tags, proper use of CODING, page speed, mobile readiness and many other factors which have nothing at all to do with how many back links your site may have.
SO what we have is this: back links are still important at this point, but not nearly as important as a well-coded, FAST-loading site with great content.
Some of the top-ranked sites for big keywords only have several hundred back links. However, that is largely because many competitors have next to no back links at all. Back links are, and will remain, important for some time! They need to be dofollow and relevant to your industry, and keep in mind that government and .edu links pass more link juice. The coding and tags have an impact as well, but without back links you will not rank, IMO.
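Checking whether the links pointing at you are actually dofollow is easy to script. A minimal Python sketch follows, assuming `requests` and `beautifulsoup4` (the URL is a placeholder for a page that links to you):

```python
# Minimal sketch: label each outbound link on a page as dofollow or
# nofollow, since only dofollow links pass "juice".
# Assumes requests and beautifulsoup4; the URL is hypothetical.
import requests
from bs4 import BeautifulSoup

url = "https://example.com/resources/"
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

for a in soup.find_all("a", href=True):
    rel = a.get("rel") or []          # bs4 returns rel as a list
    kind = "nofollow" if "nofollow" in rel else "dofollow"
    print(f"{kind}: {a['href']}")
```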
While link building is one strategy, and you can try some white hat techniques for it, you need not spend too much time or worry on it. Here, the UI/UX factor is more valuable than the links. Still, Google doesn't have a perfect yardstick for measuring this UI/UX factor either. When many people visit a site from other sources such as social media and other off-page channels, Google has little option but to keep your site at the top based on that user attention, making your raw link numbers irrelevant. If you want to beat your competitors and get top ranks, UI/UX is your best choice.
by ThinkCode | Jun 8, 2015 | Search Engine Optimization
The sitemap not only helps with indexation and site discovery, it also helps to de-index obsolete pages more quickly. If a page no longer exists on the site (returning a 404) and in parallel is no longer mentioned in the sitemap, Google will de-index it more quickly, i.e. it will re-crawl the page fewer times before it is convinced the page is actually gone.
So it appears Google keeps “a state” of the sitemap and looks at differences.
That second use of the sitemap makes it a powerful tool for combating index-size issues and ensuring a domain's "crawl capacity" is spent on pages the webmaster actually wants crawled.
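If you want to act on this, a small script can diff your live site against the sitemap. Here is a minimal Python sketch, assuming the `requests` package and a placeholder sitemap URL, that flags sitemap entries now returning 404 so you can drop them:

```python
# Minimal sketch: fetch a sitemap and flag URLs that now return 404,
# so they can be removed from the sitemap and de-indexed faster.
# Assumes requests; the sitemap URL is hypothetical.
import requests
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
sitemap = requests.get("https://example.com/sitemap.xml", timeout=10)
root = ET.fromstring(sitemap.content)

for loc in root.findall(".//sm:loc", NS):
    url = loc.text.strip()
    status = requests.head(url, allow_redirects=True, timeout=10).status_code
    if status == 404:
        print(f"remove from sitemap: {url}")
```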
An HTML site map, on the other hand, is for visitors and helps them navigate through the website (which Google likes). So both kinds of site map matter in their own way and help your website's SERP ranking. Make sure your site maps include all pages and their sub-pages, and make the HTML site map like the index page of a book: clear and logical.
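As a rough illustration of that "index page of a book" idea, here is a minimal Python sketch that groups a flat list of hypothetical page paths into a nested HTML site map list:

```python
# Minimal sketch: build a simple HTML site map, organized like the
# index of a book, from a flat list of page paths.
# The paths and section names are hypothetical examples.
from collections import defaultdict

pages = ["/about/", "/services/seo/", "/services/ppc/", "/blog/meta-data/"]

# Group pages by their top-level section.
sections = defaultdict(list)
for path in pages:
    sections[path.strip("/").split("/")[0]].append(path)

print("<ul>")
for section, paths in sorted(sections.items()):
    print(f"  <li>{section.title()}\n    <ul>")
    for p in sorted(paths):
        print(f'      <li><a href="{p}">{p}</a></li>')
    print("    </ul>\n  </li>")
print("</ul>")
```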