The organic search engine optimisation game has changed a lot over the last couple of years. Techniques of years gone by have been wiped out with algorithm changes such as Panda and Penguin. These algorithm changes cracked down on a lot of what Google deems to be webspam, but also reminded SEOs, content writers, and general web all-rounders to focus on content that’s useful to the end user.
To help you optimise your websites, I’ve compiled a list of what I believe are the key aspects of Search Engine Optimisation in 2013. This covers the sorts of things Google likes and dislikes, as well as some tips, techniques and tools to help you.
What Google Likes
1. Fresh, Original, Quality Content
The single best way to get decent rankings is to produce incredible amounts of original, timely, quality content. A static site isn’t enough any more. You need to push stuff out. It needs to be good. All of my SEO work revolves around this.
2. Inbound Links from Quality Sources
You can get reasonable levels of traffic with very few inbound links, but you’ll hit a limit sooner or later. So, what do you do?
If you were black-hat, you could manufacture a whole web of your own sites to try to game this. But given the ongoing algorithmic changes, you need quality inbound links from other people’s sites. Generally you achieve this by creating content worth sharing, but you can also help it along by guest-blogging on other sites and earning an inbound link that way. You could also incentivise people to write quality reviews about you by offering them something free (a free subscription is the most popular option).
3. Social Media Activity & Popularity
Social media is still fairly new, and it’s hard to gauge how much impact it has on search results at the moment. I’m fairly sure that Twitter and G+ influence results somewhat, while signals from services such as Facebook still seem to carry less weight. Maintain social media profiles as part of a broader marketing strategy anyway; I suspect doing so will also help with SEO.
4. Keep Your Focus on Specific Topics
By focusing your content efforts on specific topics, you will be able to strengthen your rankings across all of them. My logic goes like this: if one page is popular, another page of yours on a similar topic also deserves more traffic, as it is likely to be of similar quality.
Although I can’t prove it, I’m fairly sure Google builds and maintains a comprehensive relationship map of every conceivable topic on the internet as part of its search crawling. The more focused your content is on a particular area of this map, the more “on topic” your site is deemed to be and the better your rankings. The more “eclectic” you are, spreading out over distant nodes on the map, the more likely your site is to be treated as spammy and less relevant.
5. Domain Age
I’ve found a domain a year or two old performs significantly better than a brand-new one, and I’m almost at the stage of having data to back that up. These days I’ll register a domain if I have a good idea and maybe come back to it by the time it’s up for renewal. That way the domain has some age behind it if I actually develop the idea.
This being said, sites parked as spammy holding pages may not be rewarded as much. If you have the choice, create your own holding page with a quick paragraph about your intended idea (and a contact form).
6. Regular Posting
Post new content regularly. Some people say a new site should have one quality post per day for the first month. I say one quality post a week for at least six months; after about four months I start to see a reasonable traffic bump. With this website I do a lengthy quality post each Monday. If I have something smaller to share, it goes out on a Thursday. There is no logic behind my choice of days; I just picked something and stuck to it. I suggest you do the same.
What Google Hates
1. Web-spam: Anything dodgy or low quality
Google has become very good in the last couple of years at detecting low quality stuff on the web. By reading the SEOMoz Algorithm change log, you can see the sorts of things being targeted over time. The focus is always on things deemed lower quality or linked to black-hat techniques.
This list of low quality stuff includes, but is by no means limited to:
- Duplicate content
- Keyword stuffing
- Low quality inbound links
- Link exchanges
- Undisclosed paid links (disclose them in text, and also with rel="nofollow" for the bots)
- Garden variety spam
- Anything deemed “Black hat”
- Social network follower stuffing / fake accounts (I think – but cannot confirm – that gaming your follower count could indicate broader malpractice so Google penalises it if detected)
- Content masking (showing different content to Google than to regular users)
- Low quality site design
- Suspicious link structure
- A bajillion other factors
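On the paid-links point above, disclosure can be as simple as a sentence for readers plus a rel="nofollow" attribute for the crawlers. A minimal sketch (the sponsor name and URL here are made up):

```html
<!-- Disclosure in visible text, for human readers -->
<p>Disclosure: this review was sponsored by Example Co.</p>

<!-- rel="nofollow" on the link, so bots don't count it as an endorsement -->
<a href="http://example.com/" rel="nofollow">Example Co.</a>
```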
2. Duplicate Content
Major algorithm changes last year and the year before impacted people who were:
- Scraping others’ content and passing it off as their own: think forum and Q&A site rip-offs
- Re-using their own content across multiple sites
I don’t think gaming this is as simple as replacing words with synonyms or shuffling the sentence structure. I suspect there’s a fancy heatmap-style algorithm in play (similar to how YouTube detects copyright-infringing video) that can detect and match patterns on a page-wide basis.
How to succeed in a Google-dominated Internet
1. Write content
Simple: Blog, create articles, publish videos, write detailed product pages, etc.
2. Re-work your squeeze and landing pages
You can hardly get any organic search traffic with near-zero-content pages or sites, which is why internet startups, affiliate marketers, and anyone else relying on a signup squeeze page are struggling. The trend now is to move to long-form pages that present visitors with feature descriptions, testimonials, use cases, and anything else that lengthens the page and gives it more value. Ultimately this is good for conversion rates too: visitors convert for different reasons, and a long-form page can tick multiple boxes in their minds.
3. Longtail keywords
My strategy to drive traffic is to focus on longtail keywords. I wrote about this last week, but basically I don’t care about highly-competitive keywords. Others can fight over those while I’m pulling in serious traffic through other means.
The ‘trick’ is to write on topics which can in some way link back to what you’re selling. Example: when I started my website, I blogged on some general IT support topics. Those will never convert to a sale, newsletter signup, or return visit because of the nature of the traffic they attract. What will convert for me is quality articles on broadcast technology (a very low-competition area in general), WordPress nitty-gritty topics, and a number of other posts such as this one.
4. Monitor Numbers and Respond To Them, But Don’t Live in the Short-Term
I use several tools to help me work out how people are finding me, what they are doing once they’re on the site, and related areas I could write about. I check these weekly to see what’s happening, and perhaps every couple of weeks I might write a post in response. At this stage I still need to break new ground myself, so I don’t let the numbers kill an idea.
Don’t let these sorts of tools drive all of your content, or else it will become stale and formulaic. You are the best authority on your topics, and only you know the sorts of topics that might be worth covering. The tools only show related areas you may have missed.
5. Optimise Titles and Descriptions
Custom-write all of your page titles and descriptions. Think of it as part link bait, part keyword optimisation. There also needs to be a strong correlation between your <title> tag, your <h1> tag, and your opening paragraph. Stack Overflow actually uses the main keyword as the first word in its <title> tag; this seems to drive results for them, but I don’t have my own experience to back it up.
Titles and descriptions are mainly there to entice users to click on your link in the SERP, but titles bearing no resemblance to the content will probably be penalised.
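As a rough sketch of that title/heading/opening-paragraph correlation, here is a hypothetical page (the topic and all wording are invented for illustration):

```html
<head>
  <!-- Keyword-first title, in the Stack Overflow style -->
  <title>Playout Systems: Choosing One for a Small Broadcaster</title>
  <meta name="description"
        content="How to choose a playout system for a small broadcaster:
                 redundancy, scheduling and cost compared.">
</head>
<body>
  <!-- The h1 and opening paragraph echo the title's main keyword -->
  <h1>Choosing a Playout System for a Small Broadcaster</h1>
  <p>Playout systems vary enormously in price and capability...</p>
</body>
```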
Tools to help you
1. Google Analytics
This is best for analysing what a visitor does once they are on your site. Since SSL was enabled for all Google Accounts users, you can’t see most incoming keywords, as the referrer isn’t passed over SSL. Bummer. There’s no real way around this. Just use it for on-site analysis and use other tools to track search terms.
2. Google Webmaster Tools
This shows you all the keywords you appear in SERPs for, plus the click-through rate on each. Using this you can tweak and optimise your <title> and meta descriptions to try to push up the click rate. I find I rank well for the most peculiar keywords, but have low click-through on them. If they are relevant: time to optimise. If they aren’t: move on; it’s not worth trying to get something out of them.
3. Hittail
Hittail is a decent tool for showing you things you mightn’t have thought about before. It’s handy to try for at least the 30-day trial period, but useless if you don’t have any content yet: you need at least 1,000 monthly uniques before it starts working at all. I recently passed this and am starting to get some reasonable suggestions.
Remember that it’s also limited by the SSL referrer issue mentioned above, so it’s going to miss a lot of things. If there were a version of Hittail that integrated with Webmaster Tools, that would be neat.
4. Google Adwords Keyword Tool
This shows you the average monthly local and global searches for keywords. I do many searches in this before writing any substantial post or website to see the best angle to come at things from. Once a term surpasses a few thousand monthly searches, it gets hard to rank easily (still achievable, but not like shooting fish in a barrel).
5. Follow Matt Cutts
What SEO techniques have worked for you? Share in the comments below. Also, please subscribe to my newsletter to get more posts like this.