
17 Tips to Deploy Effective Structured Markup

Semantic SEO Solutions

Step #1

DEPLOYING EFFECTIVE STRUCTURED MARKUP

Prior to adding structured markup to your website, it is imperative that all pages fully comply with SEO best practices and that your site is in excellent SEO health.

PURPOSE: This article shares all of the SEO best practices required and explains why it is so important to have an SEO-healthy site before adding structured markup.



SEO site health is critical to the effectiveness of structured data markup. For example, if the website has “connection failed” server errors, the bots may not consume your structured data markup. This article provides a checklist of items and helpful resources for measuring and correcting your SEO site health prior to adding structured data markup.

SEO Site Health Checklist

The following list represents the majority of SEO-related issues we believe search engines would like to see in compliance prior to deploying structured data markup.


One cannot expect to have 100% of all issues resolved; however, this list will give us a baseline from which to begin. Any items left over or incomplete should be placed into a future IT roadmap to be completed at a later date.

Some of the items below require “auditing software” or a “crawler” to check/verify all pages on a scalable level. In a future article, we will provide a list of third-party services for checking things like Title and Description tags site-wide.

Also, some of the tasks below can be checked using Google Search Console (formerly Google Webmaster Tools, or GWT). GWT offers some good basic information on server errors, robots.txt and other technical factors which can hinder your website rankings. Removing these barriers allows the search engines more efficient use of their time, and they appreciate that.

Sir Tim Berners-Lee, the inventor of the World Wide Web, envisions a noteworthy transition we should all pay close attention to when he stated that we are currently moving “from a Web of documents to a Web of data.”


All websites must have a working GWT account before we can start the process of checking SEO site health. You will need the client’s credentials (logins) for GWT and Bing.

1. Check home page status code and set preferred domain


Importance: Assigning a preferred domain name prevents search bots from thinking you have more than one website and directs them to one domain name for your business.

The www.domain and other versions of the domain must be 301 redirected to the non-www domain (or vice versa).

https://support.google.com/webmasters/answer/44231
301 is imperative (mandatory)
302 is unreliable (temporary) and does not transfer link value
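
As a sketch of what this looks like in practice, the following hypothetical Apache .htaccess rules 301-redirect the www version to an assumed preferred non-www domain (example.com is a placeholder, and mod_rewrite must be enabled):

# Redirect all www requests to the preferred non-www domain with a 301.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ https://example.com/$1 [R=301,L]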

2. Check website pages server status code

Importance: A 302 redirect code does not pass any authority or trust the page may have had to a new page. It is particularly harmful to use a 302 redirect for a home page. Always use a 301 permanent redirect to ensure all the authority and trust built up over time is transferred to the new page. Web pages redirected within the website must all use 301.

https://support.google.com/webmasters/answer/35769
301 is imperative (mandatory)
302 is unreliable (temporary) and does not transfer link value
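
A quick way to spot-check any URL’s status code from the command line is curl; the one-liner below is a minimal sketch using a placeholder domain (a permanently redirected page should report 301, not 302):

# Print only the HTTP status code the server returns for this URL.
curl -s -o /dev/null -w "%{http_code}\n" -I https://www.example.com/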

3. Check website for a 404 “custom error page”

Importance: The lack of a custom 404 error page misses the opportunity to provide bots with links to your important pages, and can confuse users who may abandon the website entirely.


A good 404 custom error page will keep the bots moving through the website efficiently and provide users with helpful information to get back on track with their research, purchase, desired action, etc.

404 must render a “custom 404 error page”
easy to test by typing in domain.com/whatever
https://support.google.com/webmasters/answer/93641
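
On Apache, a custom error page can be wired up with a single directive; this is a minimal sketch assuming the custom page lives at /404.html (adjust the path to your actual error page, which should still return a 404 status):

# Serve the custom error page whenever a request 404s.
ErrorDocument 404 /404.html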

4. Check server error status for other errors (use GWT).


Importance: When a bot cannot connect with a web page, it may decide to leave. When this happens repeatedly, you are training the bot to leave your page alone because the website or web page/s are broken. It is imperative to remove these errors to the best of your ability, allowing the bots to easily crawl, index and rank your entire website.

Verify pages are “200 ok” status
Check for “connection failed”
Check for “internal server error”
Identify and remove/fix 500, 501 errors
Check for Meta robots and Meta refresh tags

5. Check Robots.txt (use GWT).

Importance: The robots.txt file is the first stop for a search bot. It will make or break any future crawl activity with your website. It is the equivalent of a handshake when you meet someone; you want to make a good impression with a firm, strong handshake, e.g., a properly coded robots.txt file.

Verify bots are allowed to crawl website.
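
For reference, here is a minimal robots.txt sketch that allows all bots to crawl the site; the /admin/ path and sitemap URL are assumed placeholders:

# Allow all bots to crawl; keep an assumed private area out of the crawl.
User-agent: *
Disallow: /admin/

Sitemap: https://example.com/sitemap.xml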


6. Keep percentage of non-HTTP URIs/URLs to a minimum

I.e., naming schemes such as ftp: , mailto: , tel: , etc., are not advisable to proliferate (as these may change).

File locations and mail addresses change, and telephone numbers change often. By keeping this class of naming schemes to a minimum, there is less likelihood of change conflicts or inconsistency.

7. Keep URLs stable and persistent…

…as changing URLs later will break any already established links.

URLs should not reflect implementation details that may need to change at some point in the future.
For example, including server names or other indicators of underlying technical infrastructure in URIs is undesirable.

Avoid using port numbers in URLs as these may change.
The mod_rewrite module for the Apache Web server, and equivalents for other Web servers, allows you to configure your Web server such that implementation details are not exposed in URLs.
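
As a hypothetical illustration, the mod_rewrite rule below serves a clean, stable URL (/products/42) while the underlying implementation stays hidden, so the technology can change later without breaking the URL (the PHP file and parameter names are assumed for the example):

# Map a clean URL onto the underlying script without exposing it.
RewriteEngine On
RewriteRule ^products/([0-9]+)$ /product.php?id=$1 [L]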


8. Keep URI lengths short and mnemonic


Short, mnemonic URIs are useful for human website users, and many search engines prefer short URI lengths as they facilitate efficient indexing.

9. Site Speed is important – fix all issues

Importance: Site and page load speed affect a bot’s ability to consume your entire website efficiently with minimal resources. This results in better indexing and ranking capabilities.
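
Two common server-side speed fixes are browser caching and compression; the following is a minimal Apache sketch, assuming mod_expires and mod_deflate are enabled (the cache lifetimes are illustrative, not prescriptive):

# Cache static assets in the browser for a set period.
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType text/css "access plus 1 week"
</IfModule>
# Compress text responses before sending them.
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>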

10. Rel=canonical link tag


Importance: The rel=canonical tag is one easy method for preventing duplicate content, which search bots will greatly appreciate. It tells the bots that a particular page is very similar (almost identical) to another page, and identifies the primary page to consume, index and rank.
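
In practice this is a single link tag in the head of the near-duplicate page, pointing at the primary version; the URL below is a placeholder:

<!-- Tell bots which version of this page to index and rank. -->
<link rel="canonical" href="https://example.com/products/" />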

11. Make sure your pagination is…


…properly coded for search bots to follow deeper content, e.g., page 1 of 10, page 2 of 10, page 3 of 10, etc.

Importance: When a bot cannot crawl past Page 1 to the additional related page/s, it becomes confused and may not consume, index and rank all of your pages. The bot may determine your website lacks content and/or may reach a dead-end and leave your website.
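
One common way to signal paginated relationships is rel="prev"/rel="next" link tags; this sketch shows what page 2 of an assumed multi-page series might carry (URLs are placeholders):

<!-- Connect page 2 to its neighbors so bots can crawl the full series. -->
<link rel="prev" href="https://example.com/articles?page=1" />
<link rel="next" href="https://example.com/articles?page=3" />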

12. Title tags

Importance: The Title Element, commonly known as the Title tag, is a search bot’s best friend. It tells the bot what the page is about. The bots assume a Title tag describes the main theme of the page and use this information in the search engine result page/s (SERPs) to describe your website or web page. Users rely on your Title tag, reading this short description of the site or page to determine if they want to go there or continue searching for something else.

For obvious reasons, the Title tag is the most important of all meta tags.

Scan all pages to verify title tags are present, unique, and not duplicated
Meta title length should be under 70 characters, but optimally under 55, with pixel width ideally under 500
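
For illustration, a title tag meeting those limits might look like this (the wording is an assumed example, not a prescription):

<title>Structured Data Markup | Semantic SEO Solutions</title>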

13. Meta Description


Importance: The Meta Description tag is a search bot’s second-best friend. It tells the bot in more detail what the page is about. The bots assume it further describes the main theme of the page, and use this information in the search engine result page/s (SERPs) to describe your website or web page in detail. Users rely on your Description tag, reading this short, detailed description of the site or page to determine if they want to go there or continue searching for something else. After the Title tag, the Description tag is the most important of all meta tags.

Meta description length should be under 155-160 characters
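
A description within that limit might look like the following sketch (wording is an assumed example):

<meta name="description" content="Learn how structured data markup helps search engines understand your pages, and which SEO site-health fixes to complete first.">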

14. H-1 and H-2 tags

Importance: The Header tags, commonly known as H1, H2, H3, etc., help the bots understand the structure and meaning of the content following the tag. They tell the bot in granular detail what each section of the page is about. For obvious reasons, the Header tags are among the most important on-page elements for helping bots understand the content and context of words on a page.

Check for keywords in tags and verify that the tag length meets SEO requirements.
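
A simple, well-structured page outline might look like this sketch, with one H1 naming the page theme and H2 tags for each section (headings are illustrative):

<h1>Deploying Effective Structured Markup</h1>
<h2>SEO Site Health Checklist</h2>
<h2>Summary and Next Steps</h2>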


15. Body text on page/s

Importance: The body text adds context around the Title, Description and Header tags. Of course, it is also the information and the message you are providing to users reading your page/s. It helps when all of these elements provide both bots and users with consistent information and messaging. Bots will reward websites that have unique, well-written body text on all pages.


Check body text on pages and verify that the number of words per page meets SEO requirements
Unique on-page content is important
No duplicate content from other sites


The following two tasks require additional budget


16. All the above items should be checked for Mobile


The site can be verified using Google’s Mobile-Friendly testing tool.

Importance: When Google recently launched an algorithm update favoring sites that are “mobile-friendly,” the SEO community labeled it “mobilegeddon.” Google, which holds an estimated 65% market share of U.S. Internet searches, wants sites to load quickly and be easy to navigate on a mobile phone.

Therefore, it is no longer possible to ignore mobile searchers and mobile websites; you are competing with those who optimize for mobile, and Google favors those who have mobile-optimized sites.

https://www.google.com/webmasters/tools/mobile-friendly/
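
Beyond testing, the usual first step toward a mobile-friendly page is the viewport meta tag in the head, shown here as a minimal sketch:

<!-- Tell mobile browsers to scale the page to the device width. -->
<meta name="viewport" content="width=device-width, initial-scale=1">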

17. HTTPS with a properly working SSL certificate is important

Importance: In August 2014, Google announced HTTPS as a ranking signal, clearly stating, “We’ve seen positive results, so we’re starting to use HTTPS as a ranking signal.”

If/when client approves this work, implement HTTPS
This requires a lot of work and collaboration with the client
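
Once a valid SSL certificate is installed, the hypothetical Apache rules below force all traffic onto HTTPS with a 301 (a sketch assuming mod_rewrite is enabled; test carefully before deploying site-wide):

# Redirect any non-HTTPS request to its HTTPS equivalent.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
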
Summary & What’s Next

Prior to implementing a semantic strategy with structured data markup, a website must be in excellent SEO condition. All technical and editorial factors hindering website rankings must be resolved. Search engines must be able to freely crawl the website, which should have fewer than 5% server errors and all server-configuration issues resolved. Content on the website must be unique and follow all search engine guidelines.

Once SEO site health is achieved by ensuring the majority of your SEO-related issues are in compliance, you are ready to implement structured data markup.

What follows in this series of 10 articles on Semantic SEO is a discussion of five useful tips for implementing structured data markup.
