For a website to maintain its best footing in search engines, best practices in code, structure and usability need to be followed. Website optimisation is about much more than making sure the right keywords appear in your pages and Meta data.
Page speed, mobile-friendliness, HTTPS and Schema Markup are all important aspects that require careful implementation. This sits alongside more basic tasks, such as maintaining internal links, redirecting out-of-date URLs and keeping XML sitemaps current. Technical, onsite SEO is now more important than ever, and I haven't even mentioned anything about usability!
Let me go into a bit more detail on the main aspects covered during onsite optimisation.
Keyword research
Time is spent using several of the best tools to find out what people are searching for and how much potential your website has to appear in front of them. Data is collected on all the searches relevant to your website and industry, showing what searches are being made, how popular they are and how effective they will be to target.
Research is about more than receiving a list of keywords. The outcome should help you understand the content that needs to be created and the answers you can provide to the questions being asked online.
Meta titles and descriptions
With keyword research and discussion, the basics start with making sure that the Meta titles and descriptions of each page are relevant, unique and provide a good call-to-action. Meta data lives within each page to tell search engines what the page is about and what to display in the search results. Without it, Google will display a snippet of text based on what it deems most relevant from the content. This isn't ideal, as you are not in control of the message and you'll be missing the opportunity to provide a convincing reason for people to visit your website.
Without implementing a Meta description, the description of my homepage would read something like this:
“…Digital marketing and web development, rossstevens_uk, Exeter, Devon, UK”
Such an automated description tends to repeat information and cuts text half-way through a sentence. Having a Meta description implemented allows the following text to appear in the search results instead:
“I have been working in digital marketing for over 15 years in Devon. Want to know what your website can achieve? Let’s talk.”
As you can see, a unique Meta description explains where I am located and what I offer, and includes a call-to-action to encourage people to get in contact.
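In code, the Meta title and description live in the page's head section. A minimal sketch of how this looks (the title shown here is a hypothetical example; the description is the one quoted above):

```html
<head>
  <!-- Hypothetical Meta title, for illustration only -->
  <title>Digital Marketing and Web Development | Ross Stevens</title>
  <!-- The Meta description shown in the search results -->
  <meta name="description"
        content="I have been working in digital marketing for over 15 years in Devon. Want to know what your website can achieve? Let's talk.">
</head>
```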
Search-engine friendly URLs
The URL of a web page should be understandable by users, represent the structure of the site and include the title of the page. An example of a non-search-friendly URL would be:
When made search-friendly, the URL looks like this:
The above URL is ideal as it represents the site hierarchy, with the categories that the user has navigated via, in this case “services”. The last phrase in the URL represents a shortened version of the page name, “digital-marketing”. This not only helps a user understand where they are within a website, but also includes keywords helpful for search purposes.
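To illustrate the difference, here are hypothetical equivalents based on the structure described above (the domain and query string are placeholders, not the original example):

```
Non-search-friendly:  https://www.example.com/?page_id=214
Search-friendly:      https://www.example.com/services/digital-marketing
```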
Content and engagement
Content is reviewed against how people actually use it: bounce rates and the length of user sessions are monitored, and each page is checked to make sure it uses imagery and is formatted correctly for the data being presented. For example, certain types of data should use the relevant code for outputting lists or tabular data. Content can also be shaped to help it appear as a featured snippet, and Schema Markup can be implemented to increase the chances of the page appearing in Google's Knowledge Graph, for example for products or events listed on a website.
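As a sketch of what Schema Markup can look like for an event (a hypothetical example using the schema.org Event type in JSON-LD, one of the formats Google accepts for structured data):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Event",
  "name": "Example Open Day",
  "startDate": "2025-06-01T10:00",
  "location": {
    "@type": "Place",
    "name": "Example Venue",
    "address": "Exeter, Devon, UK"
  }
}
</script>
```

The event name, date and venue here are illustrative; the structure is what allows search engines to understand and display the listing.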
Internal and external links
Internal links are checked to make sure they don't point to pages that no longer exist or rely on a redirect that might not be passing page authority. Links to third-party websites are also checked regularly to make sure they still exist and their URLs haven't changed. These checks also help decide which external links should pass authority and which should not.
Mobile-friendliness
The site is checked to make sure it displays correctly on different mobile and tablet devices, and that page speed is as good when browsing via 3G/4G as it is over a higher-bandwidth connection. A burger menu should allow mobile users to navigate the website easily, and calls-to-action should be obvious and non-intrusive.
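One foundation of a mobile-friendly page is the viewport Meta tag, which tells the browser to scale the layout to the width of the device rather than rendering a shrunken desktop page:

```html
<!-- Placed in the page's head; without it, mobile browsers assume a desktop-width layout -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```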
Page speed
It is now well known that the speed of your website is a vital factor in how well it performs in search engines. A fast website also lets users navigate and consume your content without delay or frustration. To make sure your site loads quickly, I analyse the code on your site and the media files embedded within it.
To improve speed, code is refactored and invalid HTML errors are resolved. Where many different files are used, I look at which caching methods to implement, so that the most popular pages and files are served from memory rather than being loaded from the web server each time. A common cause of slow loading times is large embedded image files, so I also check whether their size can be reduced with compression.
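One way to cut image weight, as a sketch (the file names here are illustrative), is to serve a compressed modern format such as WebP with a fallback, and defer off-screen images with native lazy loading:

```html
<picture>
  <!-- Served to browsers that support WebP; typically much smaller than JPEG -->
  <source srcset="hero.webp" type="image/webp">
  <!-- Fallback for browsers without WebP support; explicit dimensions prevent
       layout shift, and loading="lazy" defers the download until needed -->
  <img src="hero.jpg" alt="Hero image" width="1200" height="600" loading="lazy">
</picture>
```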
Additional SEO attributes
Where necessary, other onsite-related tasks are also undertaken. This could be in the form of heatmapping software to monitor specific areas of the website, or setting up Google Search Console for diagnostics. I also use several in-house and paid tools for monitoring the health of a website.
Alongside traffic monitoring, I also implement tracking on elements that represent key engagement areas, such as brochure downloads, contact form enquiries and image gallery interaction. Phone call tracking can also be set up, so you know when someone picked up the phone because they found the number on your website.
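As a sketch of what event tracking on an engagement element can look like (assuming Google Analytics 4 via gtag.js is already loaded on the page; the file name and event parameters are illustrative):

```html
<!-- Fires a GA4 "file_download" event when the brochure link is clicked -->
<a href="/downloads/brochure.pdf"
   onclick="gtag('event', 'file_download', { 'file_name': 'brochure.pdf' });">
  Download our brochure
</a>
```

Other analytics platforms offer equivalent event APIs; the principle is the same.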
The above are just some of the aspects I look at and maintain on a monthly basis. Website optimisation is key to getting your website crawled, indexed and visited naturally by a regular audience via search engines. To build on this success, you'll also need to consider off-site efforts such as PPC, content marketing and social media management.