Too many webmasters think of SEO in terms of things you do after a website is created, whether that’s optimizing specific on-page variables in order to maximize the odds of being ranked for particular keywords or the process of soliciting backlinks from qualified sources to power off-page SEO.
However, ignoring the important role that your site’s coding can play in its overall optimization is essentially the same as building your home on an unstable foundation. Consider all of the following ways that coding can help your site’s SEO in order to avoid missing out on these critical benefits.
Tip #1 – Validate your code for search engine spider accessibility.
Keep in mind that the search engine spider programs have some serious limitations when it comes to crawling and indexing your site. Since they really only read text effectively, other elements on your site, including image, audio, video, and script files, can all prevent important site text from being crawled and indexed appropriately.
To see for yourself exactly how the search engines interpret your pages, use the Webconfs “Search Engine Spider Simulator” tool to review your website. If you notice that chunks of text are missing from your pages, fix and validate your code so that the search engines are able to find your information.
Tip #2 – Use coding to create SEF URL rewrites.
Creating search engine friendly (SEF) URLs is beneficial from both an SEO perspective and in terms of the user experience. The specific way you’ll need to modify your site’s code in order to minimize the number of extraneous characters and codes that are present in your URL will depend on the specific platform your site runs on. If you use WordPress, Joomla, or any other CMS, you should have access to plugins or internal dashboard settings that will allow you to make the necessary changes. In some other cases, particularly when it comes to open source ecommerce platforms, you may need to address your permalink structure within your .htaccess file.
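On an Apache server, one common approach is a rewrite rule in your .htaccess file. The sketch below (hypothetical paths, assuming mod_rewrite is enabled) maps a friendly URL like /product/42 onto the underlying query-string version:

```apache
# Enable the rewrite engine (requires Apache's mod_rewrite module)
RewriteEngine On

# Serve /product/42 from product.php?id=42 without changing the
# address the visitor (or search engine) sees
RewriteRule ^product/([0-9]+)/?$ product.php?id=$1 [L,QSA]
```

The exact rule you need will vary with your platform's URL structure, so treat this as a starting point rather than a drop-in fix.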
Tip #3 – Clean your code to facilitate site speed improvements.
Although your site’s code might start out “clean,” over time, it’s common for website modifications to result in a number of different errors that can slow down your site’s operation. For this reason, it’s a good idea to perform regular checks that account for all of the following issues:
- Eliminate excess whitespace, while still keeping your code readable
- Use an HTML validator to eliminate broken and unpaired tags
- Use a broken link checker tool to remove invalid URLs
Tip #4 – Serve text-based alternatives to on-page scripts.
As mentioned in Tip #1, the search engines can’t usually access information that’s contained within image, video, or script files. However, as these elements can go a long way towards improving the user experience on your site, it isn’t a good idea to eliminate them entirely.
Instead, a better alternative from a coding perspective is to serve up alternate, text-based versions of the information you’d like the search engines to index. As an example, when serving up Flash files, consider using the SWFObject2 library, which will automatically deploy alternate text-based content whenever it detects users or search engine spiders that can’t process these file types correctly.
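As a sketch of how that works in SWFObject 2 (file and element names below are hypothetical), the library swaps in the Flash movie only when the visitor's browser can actually play it, leaving the text fallback in place for everyone else, including search engine spiders:

```html
<script src="swfobject.js"></script>
<script>
  // Replace the contents of #flash-content with the SWF only if
  // Flash Player 9+ is detected; otherwise the fallback text remains.
  swfobject.embedSWF("promo.swf", "flash-content", "550", "400", "9.0.0");
</script>

<div id="flash-content">
  <p>Text-based fallback describing the same content, which search
     engines can crawl and index normally.</p>
</div>
```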
Tip #5 – Set up “noindex” tags to keep low-value pages out of the index.
While there’s no way to control the behavior of the search engine spiders with 100% accuracy, telling them not to index certain pages on your site can be useful from an SEO perspective. Note that “noindex” is a robots meta tag placed in each page’s HTML (a robots.txt “Disallow” rule blocks crawling, but only a noindex directive reliably keeps an already-discovered page out of the search results). This tag should be added to any pages that shouldn’t appear in the search results, including:
- Shopping cart and checkout pages
- User dashboard pages
- Archive pages
- Contact pages
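To be clear on placement: the noindex directive is a robots meta tag that goes in each page’s HTML, not in robots.txt (which only controls crawling via “Disallow” rules). A minimal example:

```html
<!-- Add to the <head> of any page that shouldn't appear in search
     results; "follow" still lets spiders pass PageRank through links -->
<meta name="robots" content="noindex, follow">
```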
Tip #6 – Use “rel=canonical” to deal with duplicate content issues.
If you use a CMS program like WordPress, Magento, or Joomla to build your site, chances are you’ve got duplicate content issues that result from the way these platforms create URLs. Whenever you add a new post to your website, it’s not uncommon for these systems to automatically generate any or all of the following options:
- Yoursite.com/post-name.html
- Yoursite.com/category1/post-name.html
- Yoursite.com/category2/post-name.html
- Yoursite.com/archive/date/post-name.html
Because all of these different URLs serve the same page, you risk being caught by duplicate content filters within the search engines if you don’t specify exactly how each of these URL variations should be treated.
The best way to instruct the search engines on how to handle your URLs is through the use of the “rel=canonical” tag. This tag can be added to the <head> section of your website either by hand or through the use of a plugin, and it tells the search engines which single URL should be treated as the authoritative version of the page, consolidating ranking signals from the duplicates.
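Using the example URLs above, every variation of the post would carry the same tag pointing at the preferred address:

```html
<!-- In the <head> of each URL variation that serves this post -->
<link rel="canonical" href="http://yoursite.com/post-name.html">
```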
Tip #7 – Set up 301 redirects to ensure proper PageRank flow.
When it comes to setting up proper 301 redirects, there are two coding situations you’ll want to consider from an SEO perspective. First, set up a redirect to inform the search engines that the www and non-www versions of your URLs should be treated the same.
Second, if you ever move content within your site (for example, if you change the title and permalink of a blog article), create a 301 redirect to inform the search engine spiders of the move. Doing so will minimize the potential loss of PageRank that can occur when backlinks no longer resolve to your former URLs.
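On an Apache server, both situations can be handled in .htaccess. A sketch covering each case (domain and paths are hypothetical, and mod_rewrite is assumed for the first rule):

```apache
RewriteEngine On

# Case 1: permanently redirect non-www requests to the www version
RewriteCond %{HTTP_HOST} ^yoursite\.com$ [NC]
RewriteRule ^(.*)$ http://www.yoursite.com/$1 [R=301,L]

# Case 2: a renamed blog post gets a 301 to its new permalink
Redirect 301 /old-post-name.html /new-post-name.html
```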
Tip #8 – Use microdata to create rich snippets.
One recent addition to the SEO developer’s toolbox is microdata, a set of HTML attributes that allows you to add machine-readable meaning to your site’s markup. Not only can these elements help your site to be indexed and ranked correctly, they can also boost your clickthrough rates from the SERPs through the creation of enhanced search result listings known as “rich snippets.”
As there’s some speculation that overall clickthrough rates from the Google SERPs are being weighted as a ranking factor, adding these new features may help a site’s SEO in addition to driving extra traffic from the search results.
For more information on what rich snippets are and how to create them through the use of microdata, check out Schema.org.
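As an illustration, a product review marked up with Schema.org’s Review and Rating types might look like this (the names and values are hypothetical):

```html
<div itemscope itemtype="http://schema.org/Review">
  <span itemprop="name">Widget Pro 3000 Review</span>
  by <span itemprop="author">Jane Doe</span>
  <div itemprop="reviewRating" itemscope itemtype="http://schema.org/Rating">
    Rating: <span itemprop="ratingValue">4</span> out of
    <span itemprop="bestRating">5</span>
  </div>
</div>
```

Markup like this is what allows Google to display star ratings and reviewer names directly in the search result listing.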
Tip #9 – Combine script files to speed up load times.
Recently, site loading speed has gained significant importance as a search engine ranking factor, based on Google’s stated desire to reward fast sites in the search results.
Unfortunately, if you’ve built out your site using tons of different scripts in order to provide additional functionality for your visitors, loading all of these various code files at once can bring down your site’s performance substantially. By combining these individual code sheets into a smaller number of files, you’ll minimize the long load times caused by excess script demands and improve your site’s overall SEO.
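At its simplest, combining scripts is just concatenation. The sketch below (with hypothetical script names) merges three separate JavaScript files into one bundle so the browser makes a single request instead of three; in practice you’d also minify the result:

```shell
# Three hypothetical standalone scripts
printf 'console.log("menu");\n'   > menu.js
printf 'console.log("slider");\n' > slider.js
printf 'console.log("forms");\n'  > forms.js

# Concatenate them into a single file to serve from one <script> tag
cat menu.js slider.js forms.js > bundle.js
```

Order matters when scripts depend on one another, so concatenate them in the same sequence they were originally loaded.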
Tip #10 – Utilize CDNs to minimize page load resources.
Finally, if you’ve made all of the possible modifications to your website’s code and you still haven’t been able to achieve measurable improvements in your site’s load times, consider utilizing a content delivery network (CDN) to serve your static content from a distributed network of servers, minimizing the work your own server must do to load each page.
CDNs like Amazon’s popular S3 service or RackSpace are an especially good idea if you host a large number of images, audio files, or video files on your site. If you feel that excess file demands may be dragging down your load times, look into serving up remote content via CDN services. They’re often quite cost-effective to use, and they can make a big difference in terms of your site’s overall SEO.
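In practice, this usually just means pointing your heavy assets at the CDN’s domain instead of your own (the URLs below are hypothetical):

```html
<!-- Heavy media served from a CDN bucket rather than your web server,
     freeing your server to handle the page itself -->
<img src="https://cdn.example.com/images/hero-banner.jpg" alt="Hero banner">
<video src="https://cdn.example.com/video/product-demo.mp4" controls></video>
```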