SEO Guidelines for Developers

Now that you’ve chosen to rely on organic search results to drive your traffic, you need to keep that in mind when coding your pages. SEO is about much more than keywords and content marketing. In fact, a lot of the unseen technical aspects help decide where a page shows up on the Search Engine Results Page (SERP).

First and foremost, you have to make sure search engines have access to your site and their robots can find your page’s content. You can see how crawlers see your page in Google’s Search Console.

Once you make sure your site’s basic elements can be found, seen, crawled and indexed, it’s time to use these guidelines to make sure robots can understand your pages and how they relate to keywords. Most importantly, you need to demonstrate to these search engines what sort of user experience your site will provide.

Writing URLs for SEO

Clean URLs

A page’s URL is super important for user experience and SEO. It’s the first thing search engine crawlers see, and it should tell them the basics about the page and its content. Your URLs should be clean, easy to read, descriptive and free of URL parameters.

Take a look at these two URLs:

https://www.example.com/category/index.jsp?category_id=456&product_id=789&referrer=1011

https://www.example.com/category/product-name

The first URL will confuse robots and human users alike because there’s no indication of what category or product the page is for. Users who encounter links to this URL will be less likely to click on it, and search engines will have a hard time figuring out the page’s topical relevance.

The second URL is much better — it is easy to read, tells you the category and the product you will find on the page, and doesn’t have any of those pesky query strings.

It’s pretty typical to end up with URL parameters because of analytics and tracking programs, or when your CMS generates dynamic page elements like filters or sorting. But if you are using a CMS like WordPress, you can rewrite the URLs by adjusting the permalink settings (under Settings → Permalinks).
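
If you aren’t on a CMS, you can often do the same thing at the server level. Here’s a minimal .htaccess sketch, assuming Apache with mod_rewrite enabled; the path pattern and the product_slug parameter are illustrative, not from a real application:

# Serve clean URLs like /category/product-name from the underlying
# parameterized page, keeping the query string out of sight.
RewriteEngine On
RewriteRule ^category/([a-z0-9-]+)/?$ /category/index.jsp?product_slug=$1 [L,QSA]

Your application still has to resolve the slug to the right product; the point is that users and crawlers only ever see the clean path.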

Optimized URLs

Both the structure and the words you use in your URLs are crucial for SEO. The URL’s path helps search engines comprehend the page’s relationship to the rest of the site. The words you use in your URL signal how relevant that particular page is to a topic or a keyword.

The elements of a well-optimized URL structure are as follows:

Short and Descriptive: The keywords you use should describe the page’s content. If you are unable to use keywords in your URLs for some reason, keep the path as short as you can, and don’t use any stop words (i.e. the, a, on, etc.).

Hyphens, not underscores: Always use hyphens rather than underscores to separate words in URLs. Search engines treat hyphens as spaces, so they can read a URL like example.com/fat-cat-in-a-hat. Underscores don’t translate to word separators, though, so they would read example.com/fat_cat_in_a_hat as fatcatinahat. Nonsense.


Keywords at the beginning: Your most important keywords should go at the beginning of the URL. Crawlers assign more value to those words – which is another reason to keep your URLs short. The fewer words in the URL, the more value each one has. Bear in mind, you need to use those keywords naturally. There’s nothing worse than Google seeing your site as low-quality or spammy. If you are targeting a long-tail keyword, consider removing the category from the URL.

If you optimize your URLs with keywords, it’s more likely the anchor text for your links will use relevant keywords as well.

Meta Tags

It’s important that your code creates a quality page for your users, but search engines also look at meta tags to learn things about your page. Even if you aren’t writing the meta tags yourself (this is usually a marketer’s job), you should still understand their role in SEO.

The three meta tags that are important for your SEO are as follows:

Title Tag

The title tag is a super important on-page SEO element, as it is probably the strongest signal you can give search engines about your page’s topic. Use your most important target keyword at the beginning of the title so search engines can see if it’s relevant to a query.

A well-optimized title tag uses a maximum of 60 characters, including spaces and punctuation; the ideal length is between 50 and 60 characters. If you use more than one keyword, or you include your brand, separate them with a pipe (the | character).

Your title tag should look something like this:

<title>SEO Rules for Developers | SEO Guidelines | BrightDesign</title>

Additionally, if you are optimizing your site for local search, it’s smart to include the location and industry in your title tag:

<title>Duff’s Brewery | Beer | Springfield</title>

Meta Description

It’s true that meta descriptions aren’t used as a direct ranking signal, but they are still important for your SEO. Search engines will look at them to help decipher a page’s topic. They also combine with your title tag to form a search snippet, which is the title, link and page description shown in search results.

A good, enticing meta description works as a sort of advertisement for your site. Meta descriptions, when written for your audience, will draw users to your site, increasing your click-through rate (CTR) – which Google loves. It will also decrease your bounce rate if your page delivers what your meta description promised.

If it makes sense for your brand, you can use words like “sale” or “free shipping” to attract in-market searchers. That meta tag would look something like this:

<meta name="description" content="Your short page description that encourages page visits" />

Robots

The robots meta tag tells crawlers whether or not they can index a page or follow that page’s links. This particular meta tag will stop search engines from indexing pages they find by following links on other sites (something you can’t prevent with robots.txt alone).

<meta name="robots" content="noindex"/>

You can stop search engines from following links on your site by adding the “nofollow” value to the content attribute. If your page has a lot of links you don’t want to pass value to, or if you are including paid links, use this robots meta tag:

<meta name="robots" content="noindex, nofollow"/>

Be advised: disallowing pages in your robots.txt file doesn’t mean you don’t need a meta robots tag. Google won’t crawl these pages, but they might still show up in search results, with the description replaced by “A description for this result is not available because of this site’s robots.txt”.

If you are using the meta robots noindex tag, do not also disallow the page in the robots.txt file, because that will prevent crawlers from ever seeing the noindex tag.
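
To make that interaction concrete, here’s a minimal robots.txt sketch; the path is illustrative:

# robots.txt — blocks crawling of everything under /private-archive/.
# Don’t combine this with a meta robots noindex on those pages: if
# crawlers can’t fetch a page, they never see its noindex tag.
User-agent: *
Disallow: /private-archive/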

Redirects

Developers are always moving things around a site, sometimes hosting content at a new URL and setting up redirects to send users to the new page. Redirects are actually good for SEO because search engines like it when there is one canonical version of something.

If you have two or more paths to get to the same destination (like if you have moved content to a new folder temporarily, or if you copied pages to a subdomain), crawlers will get perplexed and treat your pages as duplicate content.

It’s important to use redirects on your old pages that point to the new versions, so that both humans and search crawlers land in the right place. Without redirects, search engines might serve the wrong page in search results.

A huge plus of using 301 (permanent) and 302 (temporary) redirects is that they pass full link juice to the destination page. You can move your content without suffering in terms of rankings or traffic. It’s also good for users, who won’t land on dead links or 404 pages. Google has said it treats 302 redirects much like 301s, passing full value to the destination page.
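
As a concrete example, a permanent redirect in an Apache .htaccess file might look like this (mod_alias assumed; the paths reuse the earlier illustrative URLs):

# Send visitors and crawlers from the old URL to the new one for good.
Redirect 301 /old-category/product-name https://www.example.com/category/product-name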

Note: don’t use more than one redirect in a row. Redirect chains look suspicious and low quality to Google. It will also slow load time for users and is inefficient for your server.

Schema Markup

Schema.org markup is the best way to tell search engines what your page is about in a language they can understand. You can use it in lots of places on your site, like your About Us page, to mark up your address, prices, opening hours and so on.

Google uses this information in its Knowledge Graph and rich cards, which is a huge boost for your SEO. Want to see schema markup in action? Just search for your favourite recipe.

You’ll notice the search results still use the title, URL and meta description, but now there are pictures of the dish and star ratings. It’s a simple concept: when search engines can understand your content, it’s easier for them to decide how it relates to different topics and queries. That understanding leads to higher rankings for relevant searches and, ultimately, gets you in front of your target audience.
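
For a feel of what the markup itself looks like, here’s a minimal JSON-LD sketch reusing the fictional brewery from the title tag example; every value is illustrative:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Duff's Brewery",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example Street",
    "addressLocality": "Springfield"
  },
  "openingHours": "Mo-Sa 11:00-23:00",
  "priceRange": "$$"
}
</script>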

Mobile Friendliness

Mobile friendliness is a known ranking signal, measured by a few different criteria. If you’ve already created your site’s mobile version, check whether it passes Google’s Mobile-Friendly Test.

You can also run a website review to find out what needs to be adjusted. A review will surface issues that could be hurting your site’s mobile friendliness, like text size or tap target size.

Mobile Page Speed

Loading time is a really important part of mobile friendliness: around 40% of users will leave a mobile page if it hasn’t loaded within 3 seconds, and Google expects your above-the-fold (ATF) content to load in one second or less.

Numbers time: after the normal overhead of DNS lookup, TCP handshake and HTTP request, you’ve got roughly 400 milliseconds left to render your ATF content. That’s not impossible; you can speed up your mobile page by doing the following:

Optimizing Your Images: Shrinking an image with HTML width and height attributes only changes how it’s displayed; the browser still downloads the full-size file. Instead, use an image editor like Photoshop to actually reduce the file size.

Leveraging Browser Caching: You can leverage browser caching to cut the number of HTTP requests returning visitors make (see the .htaccess sketch after this list).

Minimizing ATF Content Size: The initial TCP connection can’t fully utilize a connection’s bandwidth on the first round trip, so the number of packets it can send is very limited. A modern server starts with a congestion window of about 10 packets of roughly 1,460 bytes each, which works out to around 14 KB. To render your ATF content within that first round trip, keep it to roughly 14 KB compressed. Keep your server software updated so its initial congestion window isn’t limited to even fewer packets.

Minifying Code: Remove any superfluous characters from your JavaScript and style sheets. You can do this with tools like YUI Compressor or JSMin. Minifying your code improves caching and reduces bandwidth usage.

Using Google AMP: Google stores pages built with Accelerated Mobile Pages (AMP) markup in a cache and serves them almost instantaneously from there, which can really help your mobile SEO.
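
Here’s the browser caching sketch promised above: a minimal .htaccess example assuming Apache’s mod_expires module is enabled; the cache lifetimes are illustrative, not recommendations:

# Tell browsers how long they may cache static assets, so returning
# visitors skip those HTTP requests entirely.
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType text/css "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>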

If you are still having trouble with your page speed, use Google’s PageSpeed Insights. It measures the performance of your page for both mobile and desktop user-agents, evaluating the time it takes to render ATF content and the entire page. Browser tools like Chrome’s DevTools and Firefox’s Web Console can also detect issues on your page.
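
PageSpeed Insights can also be queried programmatically; here’s a minimal sketch, assuming the v5 runPagespeed endpoint, with an illustrative site URL:

# Ask PageSpeed Insights for a mobile analysis of a page.
curl "https://www.googleapis.com/pagespeedonline/v5/runPagespeed?url=https://www.example.com&strategy=mobile"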

Mobile Site Structure

There are three options for building the mobile version of your site: responsive design, mobile subdomains and dynamic serving.

Responsive Design: This is the best choice, and it’s what Google recommends. You serve the same HTML to every device and let CSS adapt the layout; the key addition to your code is the viewport meta tag, which tells browsers to scale the page to the user’s screen size. It looks like this:

<meta name="viewport" content="width=device-width, initial-scale=1.0"/>

Dynamic Serving: You will spend more time and effort on this than on responsive design, because it requires detecting the user-agent and serving different HTML to mobile and desktop browsers. To do this, you must send the Vary: User-Agent HTTP header, which tells search engines that you serve different HTML to different user-agents.

To add the Vary: User-Agent header in Apache, add this to your .htaccess file:

Header append Vary User-Agent

For WordPress, just add this code in functions.php:

// Add a Vary: User-Agent header to every WordPress response.
function add_vary_header($headers) {
    $headers['Vary'] = 'User-Agent';
    return $headers;
}
add_filter('wp_headers', 'add_vary_header');

To set the Vary: User-Agent header directly in PHP, add this code before any output is sent:

<?php
// Must run before any output is sent to the browser.
header("Vary: User-Agent, Accept");
?>

Mobile Subdomain: This method is the most time-consuming of the three because you have to build an entirely separate mobile website and host it on a subdomain, usually something like mobile.samplewebsite.com or m.samplewebsite.com. Googlebot can’t tell on its own that these pages are meant for mobile users, so you need to annotate the relationship between the duplicate pages: a rel="alternate" link on the desktop page pointing to the mobile version, and a rel="canonical" link on the mobile page pointing back to the desktop version. This method is complicated, expensive and generally not recommended, especially for bigger sites.
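
Those annotations look something like this; the samplewebsite.com URLs are illustrative:

<!-- On the desktop page (www.samplewebsite.com/page): -->
<link rel="alternate" media="only screen and (max-width: 640px)" href="https://m.samplewebsite.com/page" />

<!-- On the mobile page (m.samplewebsite.com/page): -->
<link rel="canonical" href="https://www.samplewebsite.com/page" />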

Conclusion

Although it may seem like search engines are trying to trip you up, they really aren’t. Their main goal is to give users the best possible experience and most relevant pages based on their search intent. Which, incidentally, should be your goal as well.

Remember that there is never a guarantee that you will end up with the number one ranking in Google. You can, though, optimize everything within your control from a technical standpoint to intelligently and methodically move your site up in search results.

At the end of the day, it’s always a good idea to audit your site with a crawler to find undetected issues that could be impacting your SEO.
