
Build and submit a sitemap - Search Console Help

Related:  SEO tools

Keyword Shitter - Bulk Keyword Ideas Tool

Webmaster Guidelines - Webmaster Tools Help If you follow Google's general guidelines below, we can find, index, and rank your website more easily. We also strongly recommend that you observe the quality guidelines further down. Those guidelines describe some of the prohibited practices that can lead to a site being permanently removed from the Google index or otherwise affected by automatic or manual spam actions. General guidelines: make sure that every page on the site can be reached through a link from another findable page. Help Google find your site: create a helpful, informative website and write pages that describe their content clearly and unambiguously. Use text rather than images to present important names, links, or content. Quality guidelines: Matt Cutts talks about manual actions against webspam.

Site Search in Google Analytics - With or Without Query Parameters By Samantha Barnes / February 24, 2015 Google Analytics comes with many features out of the box, but one report you need to configure yourself is site search. If you haven't used this feature yet and you have a search box on your site, keep reading! The site search reports show what kind of content people are looking for on your site. For most websites, you can set them up entirely within Google Analytics, without modifying anything on your website. Note that the Site Search report sits under Behavior in the left-hand navigation, not under Acquisition, where organic and paid search data live. Site search tracking is not automatic like pageview tracking, both because there is a wide variety of internal search engines that function differently and because not every site has an internal search or wants to track it.
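Configuring site search in Google Analytics mostly amounts to telling it which query parameter your search box puts in the URL. A minimal sketch of that extraction step, using assumed parameter names (`q`, `s`, `search` are common conventions, not a standard):

```python
from urllib.parse import urlparse, parse_qs

def extract_search_term(url, query_params=("q", "s", "search")):
    """Return the search term carried in a URL's query string, or None.
    The parameter names are illustrative guesses -- run a search on
    your own site and inspect the resulting URL to find the real one."""
    qs = parse_qs(urlparse(url).query)
    for param in query_params:
        if param in qs:
            return qs[param][0]
    return None
```

Whatever parameter name you find this way is the value to enter in the View Settings under "Site search Tracking".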

URL structure - Webmaster Tools Help A site's URL structure should be as simple as possible. Consider organizing your content so that URLs are constructed logically and in a manner that is as intelligible to humans as possible (when possible, readable words rather than long ID numbers). For example, if you're searching for information about aviation, a URL built from readable words helps you decide whether to click the link, while one made of long ID numbers and parameters is much less appealing. Consider using punctuation in your URLs. Overly complex URLs, especially those containing multiple parameters, can cause problems for crawlers by creating unnecessarily high numbers of URLs that point to identical or similar content on your site. Such inflated URL counts can be caused by a number of issues; to avoid potential problems with URL structure, we recommend the following practices.
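The "readable words rather than long ID numbers" advice is usually implemented with a slug generator. A hypothetical sketch (the function name and the `/articles/<slug>` path scheme are illustrative, not from the help page):

```python
import re
import unicodedata

def slugify(title):
    """Turn an article title into a readable URL slug,
    e.g. for hypothetical /articles/<slug> paths."""
    # Strip accents so the slug stays plain ASCII
    text = unicodedata.normalize("NFKD", title)
    text = text.encode("ascii", "ignore").decode("ascii")
    # Replace runs of non-alphanumerics with single hyphens
    return re.sub(r"[^a-zA-Z0-9]+", "-", text).strip("-").lower()
```

This is the kind of URL (hyphen-separated words) the guidelines describe as intelligible to humans.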

Backlinks SEO

Learn about robots.txt files - Search Console Help The basics of robots.txt files: what they are and how to use them. What is a robots.txt file? A robots.txt file tells search engine crawlers which pages or files the crawler can or can't request from your site. This is used mainly to avoid overloading your site with requests; it is not a mechanism for keeping a web page out of Google. To keep a web page out of Google, you should use noindex directives, or password-protect your page. What is robots.txt used for? robots.txt is used primarily to manage crawler traffic to your site and, depending on the file type, sometimes to keep a file off Google. If you use a website hosting service, such as Wix, Drupal, or Blogger, you might not need to (or be able to) edit your robots.txt file directly. To see whether a page has been crawled by Google, search for the page URL in Google. Understand the limitations of robots.txt: before you create or edit robots.txt, you should know the limits of this URL-blocking method.
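A small robots.txt of the kind the help page describes can be checked with Python's standard-library parser; the file contents and URLs below are made-up examples:

```python
from urllib.robotparser import RobotFileParser

# Example rules: allow everything except /private/ for all crawlers.
# Remember this only manages crawling -- it does not remove a page
# from Google's index (use noindex for that).
rules = """
User-agent: *
Disallow: /private/
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("Googlebot", "https://example.com/private/page"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/about"))         # True
```

This mirrors what a crawler does: fetch /robots.txt, then test each candidate URL against the rules before requesting it.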

Site title and description - Webmaster Tools Help Google's generation of page titles and descriptions (or "snippets") is completely automated and takes into account both the content of a page and references to it that appear on the web. The goal of the snippet and title is to best represent and describe each result and explain how it relates to the user's query. We use a number of different sources for this information, including descriptive information in the title and meta tags for each page. We may also use publicly available information, or create rich results based on markup on the page. While we can't manually change titles or snippets for individual sites, we're always working to make them as relevant as possible. Titles are critical to giving users quick insight into the content of a result and why it's relevant to their query. As explained above, make sure every page on your site has a title specified in the <title> tag.
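The title and meta tags the passage mentions live in the page's head; a minimal illustrative fragment (the title and description text are placeholders, not values Google prescribes):

```html
<head>
  <!-- Descriptive, page-specific title: one of the sources
       Google may draw on when generating the result title -->
  <title>Build and submit a sitemap - Search Console Help</title>
  <!-- Meta description: one possible source for the snippet -->
  <meta name="description"
        content="How to create a sitemap and submit it to Google.">
</head>
```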

Social Status Checker - Check Social Media Metrics Social media has become a crucial piece of the overall SEO puzzle and is hard to ignore. Comments, likes, follows, shares, and Google+ activity are all widely believed to have a positive impact on the ranking of a blog or website. Getting website traffic through social media sites is known as social media marketing. It is usually based on creating quality content that appeals to readers and attracts their attention, encouraging them to share it on their social networks. The better your content is, the more visitors will read it, share it, and link to it from other websites, and search engines will start favoring your blog or website. "Content is king" sums up this approach: optimize a website for better search rankings through valuable content, especially the text of the page, which is among the most important ranking factors.

What Does a Slash at the End of a Website's URL Mean? One thing that all websites have in common is a URL, or "uniform resource locator": the formal name for a website address. Every site has a specific address that people use to visit it, much as every phone has a specific number that people use to call it. One interesting aspect of URLs is the trailing slash you often see at the end of an address. Go ahead and copy a URL from somewhere, such as a Facebook post or a website article like this one. Most marketers leave the trailing slash off a URL when they post it because it looks cleaner from a marketing standpoint. The basics of the trailing slash: traditionally, URLs that pointed to files did not include the trailing slash, while URLs that pointed to directories did. Leaving the slash off a directory URL typically results in a redirect, and the slash should not be included after filenames.
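The traditional directory/file convention described above can be sketched as a simple heuristic; the function and the example.com URLs are illustrative only, since real servers are free to redirect either form:

```python
from urllib.parse import urlparse

def looks_like_directory(url):
    """Apply the traditional convention: a path ending in '/'
    points to a directory, one ending in a filename does not.
    A heuristic sketch, not how any particular server behaves."""
    path = urlparse(url).path
    return path == "" or path.endswith("/")

print(looks_like_directory("https://example.com/blog/"))           # True
print(looks_like_directory("https://example.com/blog/post.html"))  # False
```

This is also why requesting the slash-less directory form usually costs one extra round trip: the server answers with a redirect to the slashed form.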

Study Your Competitor's SEO Strategy in Three Steps Because search results take the form of a ranking, SEO is a practice whose outcomes are directly influenced by your competitors' actions. If you are aiming for the top spot on a competitive keyword, your SEO strategies must be more creative and produce better results than your competitors', for example in terms of content or inbound links. That means you inevitably have to keep up with what your competitors are doing, not only to identify the reasons for their success but also to know what you must do to surpass them. To help you stay competitive in your online market, this article shares recommendations on the essential points to analyze when studying a competitor, mainly the on-site items, the inbound link profile, and the best-performing content.

Schema.org - Schemas for structured data

Use canonical URLs - Search Console Help If you have a single page that's accessible by multiple URLs, or different pages with similar content (for example, a page with both a mobile and a desktop version), Google sees these as duplicate versions of the same page. Google will choose one URL as the canonical version and crawl that; all other URLs will be considered duplicates and crawled less often. If you don't explicitly tell Google which URL is canonical, Google will make the choice for you, or might consider them both of equal weight, which might lead to unwanted behavior, as explained in Reasons to choose a canonical URL. When Googlebot indexes a site, it tries to determine the primary content of each page. Google chooses the canonical page based on a number of factors (or signals), such as whether the page is served via HTTP or HTTPS, page quality, presence of the URL in a sitemap, and any rel=canonical labeling.
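The rel=canonical signal mentioned above is a single link element in the head of the duplicate page; a minimal fragment with placeholder URLs:

```html
<!-- Placed in the <head> of the mobile or otherwise duplicate page,
     pointing at the preferred (canonical) URL. Both URLs here are
     placeholders. -->
<link rel="canonical" href="https://example.com/widgets">
```

Listing the preferred URL in the sitemap reinforces the same choice, since sitemap presence is another of the signals listed.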

The 20 Essential Tools for Successful SEO Semantic and technical analysis, optimization, netlinking... To carry out a natural-search optimization project successfully, you need tools to work efficiently. Here is a selection of 20 tools to support you at every stage of your site's SEO. Keyword research: 1: Google keyword generator - monthly search volume, competition... Google's keyword generator gives you information about the keywords you are targeting. 2: Copyscape - duplicate content can hurt your rankings; Copyscape lets you check that your content has not been reused on another site without your permission. Link: Copyscape. SEO analysis: 3: Google Webmaster Tools - lets you analyze your site's indexing and its health (crawl errors, blocked URLs...). 4: SEMVisu 5: SEMRush - SEMRush offers more or less the same features as SEMVisu.
