The technological part of SEO can refer to many things, as it is also close to what we call accessibility.
To put it simply: "Will bots and visitors have a good experience when browsing your website or not?"
Most of the time, SEO practitioners do not really take care of this part themselves, for two reasons:
- it is technical: because it is a bit complex, it requires more knowledge, so they outsource this part to someone else.
- they use technologies that are already clean enough to avoid most technological issues. WordPress, a well-known open-source CMS, is a good example: its community is very active and provides efficient SEO features.
Let's look in a little more detail at what these technological criteria are.
A simple thing to understand: if a person cannot access a web page, neither can the robots from search engines. Imagine that, in order to access a page on your website, you need to enter a login and a password; robots won't be able to access it.
The robots.txt file
The robots.txt file is not the only way to block a robot from visiting your website, but it is one of the easiest ways to explain accessibility, so we will use it as an example. As its name suggests, this is a file webmasters place on their website in order to talk to robots. In it, they explain to robots what they expect them to do when browsing.
A robots.txt file may look technical, but you will see it is not. Below is the robots.txt file of the https://yacy.net/ website:
    User-agent: *
    Disallow:

    User-agent: devtest.yacy
    Crawl-delay: 0

    Sitemap: http://yacy.net/sitemap.xml

The first line, "User-agent: *", means that the lines below it concern all user agents. The *, called a "wildcard", means "all"; it is what we call a special character in regular expressions, which are patterns used in programming languages to automate some tasks. A user agent is the name of the technology making the request. So here it means: whether you are Firefox, Safari, the YaCy bot, the Google bot or anything else, follow what I am saying on the next lines.
The next line says "Disallow:", which means "I do not give you the authorization to access this". Here there is nothing after the colon, so there are no restrictions: user agents are free to crawl and index whatever they want.
The next line is "User-agent: devtest.yacy", so here we are addressing only one technology, and we are telling it that there is no crawl delay.
Then we inform robots that a sitemap, which lists all the URLs to crawl, is hosted at the given location.
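You can see how a robot interprets these rules with a few lines of Python, using the standard urllib.robotparser module. This is a minimal sketch: the robots.txt content is the yacy.net example above, and the page URL checked is just an illustration.

```python
from urllib.robotparser import RobotFileParser

# The robots.txt content from the example above.
rules = """
User-agent: *
Disallow:

User-agent: devtest.yacy
Crawl-delay: 0

Sitemap: http://yacy.net/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# An empty "Disallow:" means no restriction, so any user agent
# may fetch any page of the site.
print(parser.can_fetch("Googlebot", "http://yacy.net/en/index.html"))  # True

# The devtest.yacy bot is told it can crawl without any delay.
print(parser.crawl_delay("devtest.yacy"))  # 0
```

Real crawlers read the file directly from the site (with `parser.set_url(...)` and `parser.read()`); parsing the text by hand, as here, is just convenient for experimenting.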
There are standards on the web, and as a result, there are websites listing those standards and telling you whether a web page has been written properly or not. Here are two tools you can have a look at:
- W3C validator: https://validator.w3.org/
- CSS validator: https://jigsaw.w3.org/css-validator/
When one refers to text content in SEO, it means text visible to robots. For example, when something is written within a picture, it is not real text content for robots: reading it requires extra effort that most robots cannot afford. As a result, text written inside pictures cannot play a role in the ranking of the content. That is the main reason why it is good practice to always fill in the text fields associated with a picture, such as:
- the name of the file.
- the alt attribute.
- the title attribute.
- the caption.
What is true for pictures is also true for other technologies, such as Flash, which reprocess text content to make it fancier.
To make it simple: if you want your text content to be taken into consideration, use regular text.
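To see the difference between drawn text and regular text, here is a rough sketch of what a very simple robot extracts from a page, using Python's standard html.parser. The HTML snippet is a made-up example: the words drawn inside the picture file never reach the robot, while the alt attribute is plain text it can read.

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects the text a basic robot can see: regular text nodes,
    plus the alt/title attributes attached to images."""

    def __init__(self):
        super().__init__()
        self.texts = []

    def handle_starttag(self, tag, attrs):
        # For images, only the associated text fields are readable.
        if tag == "img":
            for name, value in attrs:
                if name in ("alt", "title") and value:
                    self.texts.append(value)

    def handle_data(self, data):
        if data.strip():
            self.texts.append(data.strip())

# Hypothetical page: the promotion is also drawn inside the image
# itself, but only the alt attribute is recoverable as text.
page = '<h1>Our shop</h1><img src="summer-sale.jpg" alt="Summer sale: -50% on shoes">'
extractor = TextExtractor()
extractor.feed(page)
print(extractor.texts)  # ['Our shop', 'Summer sale: -50% on shoes']
```

If the alt attribute were left empty, the robot would see nothing at all about that picture, which is exactly the situation the list above helps you avoid.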
In order to find your content, search engine robots need to be able to access it. To access it, they follow links, either the ones you provide or ones provided by others. The more links point to your pages, the better the chances that robots will visit them and index/rank them properly. So, as a good practice, always make links to your pages, and always have at least one link pointing to a page if you want it to be indexed one day. A page without any link pointing to it is called an orphan page.
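The idea of orphan pages can be sketched in a few lines of Python. The site structure below is a made-up example: each page maps to the set of pages it links to, and we look for pages that nothing points to.

```python
# Hypothetical site: each page maps to the pages it links to.
links = {
    "index.html": {"about.html", "blog.html"},
    "about.html": {"index.html"},
    "blog.html": {"index.html", "post-1.html"},
    "post-1.html": {"index.html"},
    "old-promo.html": set(),  # exists on the server, but nothing links to it
}

# Every page that appears as a link target somewhere.
linked_to = set().union(*links.values())

# A page is an orphan if no page links to it. The homepage is left
# aside, as it is usually discovered through external links.
orphans = set(links) - linked_to - {"index.html"}
print(orphans)  # {'old-promo.html'}
```

A robot following links from index.html would never discover old-promo.html, so it would never be indexed: one internal link from any crawled page is enough to fix that.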
Adapted for mobile
For mobile users to have a good experience on your website, you need to design your web pages so that they automatically fit small screens.
Tools to help you
There are many browser add-ons out there that will help you easily check whether your website meets the minimum technical requirements for SEO.