Practical SEO methods for website structure optimization: do you know 404 pages and 301 redirects?

After optimizing the URL hierarchy and navigation structure of the website, today we will continue with the rest of the on-site optimization.

Sitemaps, robots.txt files, 404 error pages, and 301 redirects are all very important aspects of on-site optimization.

First, let's look at the sitemap. There are two versions: HTML and XML. The HTML version indexes the site's content and is made for users, while the XML version is made for search engines: it submits the site's links in one place to improve the overall crawling efficiency of the site.
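As a concrete illustration, a minimal XML sitemap following the sitemaps.org protocol might look like this (the URLs and dates are placeholders, not from the original article):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.example.com/about.html</loc>
    <lastmod>2024-01-10</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Each `<url>` entry lists one page; `<lastmod>` and `<changefreq>` hint to spiders how often to come back.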

One way to make a sitemap is with the Aizhan (爱站, "Love Site") SEO Toolkit. First, download and install the toolkit, then select its Sitemap option and add your URLs. Under the crawl URL type, select the static URL suffix type; in the XML settings, select the sitemap format and check both the .xml and .html suffixes. After clicking Crawl, the toolkit automatically generates the map files.

After generating the two versions of the map file, upload them to the site's root directory. Then it is recommended to add an entry link to the XML sitemap in the bottom navigation of the homepage, usually by placing an A tag there.
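The bottom-navigation entry can be a plain anchor tag; this sketch assumes the file was uploaded to the root directory as /sitemap.xml:

```html
<footer>
  <nav>
    <!-- link to the XML sitemap uploaded to the root directory -->
    <a href="/sitemap.xml">Sitemap</a>
  </nav>
</footer>
```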

The last step is to submit the map file to search engines through their webmaster platforms. Since the site will be updated frequently, it is recommended that the map be regenerated and resubmitted at least once a week to inform the search engines that the site has been updated, so that the spiders will actively crawl it.
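If you prefer to regenerate the sitemap yourself rather than relying on a toolkit, a short script can rebuild the XML file on each weekly run. This is a minimal sketch using only the Python standard library; the URL list is a placeholder, not from the article:

```python
from datetime import date
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    """Build a sitemaps.org-style XML document from a list of page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    today = date.today().isoformat()
    for page in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page
        # lastmod tells spiders the page changed; it refreshes on each weekly run
        ET.SubElement(url, "lastmod").text = today
        ET.SubElement(url, "changefreq").text = "weekly"
    return ET.tostring(urlset, encoding="unicode")

# Placeholder URLs -- replace with your site's real pages,
# then write the result to sitemap.xml in the root directory.
xml = build_sitemap(["https://www.example.com/", "https://www.example.com/news/"])
```

Running this weekly (for example from a cron job) and resubmitting the file keeps the lastmod dates fresh.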

Next, let's learn about the robots.txt file. It can be understood as the website's robot (spider) protocol: it is the first file a search engine spider reads when crawling the website. This file can only be placed in the root directory.

Common robots.txt directives include User-agent (which search engine the rules apply to), Allow (which directories or files may be crawled), and Disallow (which directories or files may not be crawled). An asterisk is a wildcard matching everything. Note that the root directory and frequently updated directories should not be blocked from search engine crawling, as blocking them would hurt SEO.

In addition to these directives, the address of the sitemap should be declared in the robots.txt file to increase the efficiency and frequency of crawling across the entire site.
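Putting these directives together, a typical robots.txt placed at the site root might look like the following; the blocked paths here are examples, not recommendations from the original article:

```text
# Applies to all spiders
User-agent: *
# Block non-content directories; never block the root or frequently updated sections
Disallow: /admin/
Disallow: /tmp/
# Explicitly allow a subpath under an otherwise blocked directory
Allow: /admin/help/
# Declare the sitemap so spiders find it immediately
Sitemap: https://www.example.com/sitemap.xml
```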

Next is the 404 error page. A 404 page works by reducing both the bounce rate of users and the rate at which search engine spiders get lost. The requirement for a 404 error page is that it includes a link back to the website, returning directly to the home page or a section page. At the same time, do not use a forced automatic redirect, which is unfriendly to search engines.

A simple way to produce a 404 error page is to borrow the 404 page code from another site and replace the return link, title, text, and so on. Then make a 404 image in Photoshop, pack everything into a folder, and upload it to the root directory. Finally, enable the 404 function on the host and point it at the folder containing the 404 error page to complete the setup.
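On an Apache host, for example, the hosting-panel step often amounts to a single directive in .htaccess; the path /404/404.html is an assumed example matching the uploaded folder:

```apache
# Serve the custom 404 page for any missing URL
ErrorDocument 404 /404/404.html
```

Using a site-root-relative path (rather than a full URL) ensures Apache returns a true 404 status code instead of a redirect.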

Finally, 301 redirects. A 301 redirect points multiple website domains to the primary domain name to consolidate weight, and is also applicable when changing domain names. Some companies register multiple domain names with different extensions to protect their brand. To avoid leaving those domains idle, they build websites with identical content on all of them, but this is wrong and will lead to search engine penalties. The simplest approach is to build only one website and redirect the other domains to its domain.
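On Apache with mod_rewrite, the redirect can be sketched like this in the secondary domain's .htaccess; example.net and example.com are placeholders for the secondary and primary domains:

```apache
RewriteEngine On
# Match every request arriving on the secondary domain
RewriteCond %{HTTP_HOST} ^(www\.)?example\.net$ [NC]
# Send it to the same path on the primary domain with a permanent (301) redirect
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```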

In short, sitemaps, robots.txt files, 404 error pages, and 301 redirects are all very important aspects of on-site optimization. Creating a sitemap and submitting it to webmaster platforms, writing robots.txt directives, and setting up 404 error pages and 301 redirects are the key points to keep in mind.


