Content


Title Tag

High impact / Easy to solve

FUSAI

Length: 5 character(s) (48 pixels)

HTML title tags appear in browser tabs, bookmarks and in search results.

It looks like your title tag is a little outside the ideal length. Since title tags are one of the most important on-page SEO elements, you should keep yours between 20 and 70 characters, spaces included (200 - 569 pixels). Make sure each page has a unique title, and use your most important keywords. For internal pages, start your title tags with your most important keyword(s).
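To make the guidance concrete, here is a minimal sketch of a title tag within the recommended range; the wording is hypothetical, assembled from keywords detected on the page:

```html
<head>
  <!-- 45 characters: within the recommended 20-70 character range -->
  <title>FUSAI - Desarrollo de Empresas en El Salvador</title>
</head>
```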

Meta descriptions allow you to influence how your web pages are described and displayed in search results. A good description acts as a potential organic advertisement and encourages the viewer to click through to your site.

Keep it short and to the point; the ideal meta description should contain between 70 and 160 characters - spaces included (600 - 940 pixels).

Ensure that each of your web pages has a unique, straightforward meta description that contains your most important keywords. These keywords are especially important because they appear in bold when they match the user's search query (see the Google Preview below).

Check your Google Search Console (Search Appearance > HTML Improvements) for any warning messages to identify meta descriptions that are too long/short or duplicated across more than one page.
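As an illustration, a meta description within the recommended length might look like the following; the text itself is hypothetical, built from phrases detected on the page:

```html
<!-- Roughly 117 characters: within the recommended 70-160 range -->
<meta name="description" content="FUSAI es una organización sin fines de lucro en El Salvador dedicada al desarrollo de empresas y la inclusión social.">
```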

This is a representation of what your title tag and meta description will look like in Google search results for both mobile and desktop users. Searchers on mobile devices will also see your site's favicon displayed next to the page's URL or domain.

Search engines may create their own titles and descriptions if yours are missing, poorly written and/or not relevant to the content on the page, and may cut them short if they go over the character limit. So it's important to be clear, concise and within the suggested character limits.

Check your title tag and meta description to make sure they are clear, concise, within the suggested character limit and that they convey the right message to encourage the viewer to click through to your site.

Use your keywords in the headings and make sure the first level (<H1>) includes your most important keywords. Never duplicate your title tag content in your header tag.

While it is important to ensure every page has an <H1> tag, include more than one per page only if you're using HTML5; otherwise, use multiple <H2> - <H6> tags instead.

Add the important keywords in <H> headings
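A sketch of a heading hierarchy following this advice; the heading texts are hypothetical, based on phrases found on the page:

```html
<body>
  <!-- One H1 containing the page's most important keyword -->
  <h1>Desarrollo de empresas en El Salvador</h1>
  <h2>Programa solidario comunitario</h2>
  <h3>Centro de capacitación</h3>
</body>
```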

Content Analysis

sin fines de lucro: 3
departamento de san vicente: 2
desarrollo de empresas: 2
centro de capacitación: 2
memorias de labores: 2
integral el salvador: 2
apoyo integral inversiones: 2
inclusión social: 2
programa solidario comunitario: 2
fondo pro hábitat: 2
el salvador: 3
con: 8
integral guatemala: 2
nuestro boletín: 2
somos propietarias: 2
efectivas: 4
hábitat popular: 2
empresas: 6
fusai: 5
misión: 2

This data represents the words and phrases that your page appears to be optimized around. We use what's called "natural language processing" (NLP), which is a form of artificial intelligence that allows computers to read human language, to do this analysis.

The number next to each word or phrase represents how often we detected it and its variants on the page.

Are these the keywords you want to target for your page? If so, great! Track your site’s rankings in Google search results using WooRank’s Keyword Tool.

A page's link juice is split between all the links on that page so lots of unnecessary links on a page will dilute the value attributed to each link. There's no exact number of links to include on a page but best practice is to keep it under 200.

Using the nofollow attribute in your links prevents some link juice from being passed on, but these links are still taken into account when calculating the value that is passed through each link, so using lots of nofollow links can still dilute PageRank.

Robots.txt

A robots.txt file allows you to restrict the access of search engine crawlers and prevent them from reaching specific pages or directories. It can also point web crawlers to your XML sitemap file.

Your site currently has a robots.txt file. You can use Google Search Console's Robots.txt Tester to submit and test your robots.txt file and to make sure Googlebot isn't crawling any restricted files.
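For reference, a robots.txt file with restricted directories and a sitemap pointer might look like this; the disallowed paths are placeholders, not the site's actual rules:

```text
# Placeholder paths - replace with the directories you actually want to block
User-agent: *
Disallow: /admin/
Disallow: /tmp/

Sitemap: https://fusai.org.sv/sitemap.xml
```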

See the pages you've disallowed with your robots.txt file with Site Crawl.


XML Sitemap

XML sitemaps contain the list of your URLs that are available to index and allow the search engines to read your pages more intelligently. They can also include information like your site’s latest updates, frequency of changes and the importance of URLs.

Be sure to only include the pages you want search engines to crawl, so leave out any that have been blocked in a robots.txt file. Avoid using any URLs that cause redirects or error codes and be sure to be consistent in using your preferred URLs (with or without www.), correct protocols (http vs. https) and trailing slashes. You should also use your robots.txt file to point search engine crawlers to the location of your sitemap.
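A minimal XML sitemap entry illustrating these fields; the URL, date and values are hypothetical:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://fusai.org.sv/</loc>
    <lastmod>2021-05-01</lastmod>
    <changefreq>monthly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
```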

URL Parameters

Perfect, your URLs look clean.

URL parameters are used to track user behaviors on site (session IDs), traffic sources (referrer IDs) or to give users control over the content on the page (sorting and filtering). The issue with URL parameters is that Google sees each unique parameter value as a new URL hosting the same thing - meaning you could have a duplicate content problem. Sometimes, it’s able to recognize these URLs and group them together. It then algorithmically decides which URL is the best representation of the group and uses it to consolidate ranking signals and display in search results. You can help Google recognize the best URL by using the rel="canonical" tag.
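As a sketch with hypothetical URLs: if a sorting parameter creates duplicate URLs, a canonical tag in the <head> of each variant points Google at the preferred version:

```html
<!-- On https://example.com/productos?orden=precio -->
<link rel="canonical" href="https://example.com/productos">
```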

Use the URL Parameters Tool in Google Search Console to tell Google how your URL parameters affect page content and how to crawl URLs with parameters. Use this tool very carefully - you can easily prevent Google from crawling pages you want indexed through overly restrictive crawling settings, especially if you have URLs with multiple parameters.

Check the On-Page section of Site Crawl to identify any duplicate content issues.


Hreflang Tags

No hreflang tags were found on this page

The hreflang tag is an HTML tag that tells search engines which languages and (optionally) countries a page's content is relevant for. Hreflang tags also tell search engines where to find the relevant content in alternate languages.

If your website targets users all around the world, using hreflang tags will help make sure the right content is being served to the right users.

The value of the hreflang attribute identifies the language (in ISO 639-1 format) and optionally a region in ISO 3166-1 Alpha 2 format of an alternate URL.
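A sketch of hreflang tags for a Spanish-language site serving El Salvador and Guatemala; the Guatemala URL is a hypothetical placeholder:

```html
<link rel="alternate" hreflang="es-sv" href="https://fusai.org.sv/">
<link rel="alternate" hreflang="es-gt" href="https://example.com.gt/">
<link rel="alternate" hreflang="x-default" href="https://fusai.org.sv/">
```

Here "es" is the ISO 639-1 language code and "sv"/"gt" are the ISO 3166-1 Alpha 2 region codes.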

Underscores in the URLs

Low impact / Hard to solve

Great, you are not using underscores (these_are_underscores) in your URLs.


Google sees hyphens as word separators while underscores aren't recognized. So the search engine sees www.example.com/green_dress as www.example.com/greendress. The bots will have a hard time determining this URL's relevance to a keyword.

Mobile

Mobile Friendliness

High impact / Hard to solve

Very Good

This web page is well optimized for mobile visitors.

Mobile friendly pages make it easy for users to complete objectives and common tasks and use a design or template that is consistent across all devices (uses responsive web design).

Your site is well configured for mobile users.
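The basis of a responsive configuration is a viewport meta tag plus CSS media queries; a minimal sketch (the class name is hypothetical):

```html
<head>
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <style>
    .columna { width: 50%; float: left; }
    /* Stack the columns on narrow screens */
    @media (max-width: 600px) {
      .columna { width: 100%; float: none; }
    }
  </style>
</head>
```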

Mobile Rendering

This is how your website appears when displayed on different mobile devices.

With more than half of all Google search queries originating on a mobile device, it is important to make sure your mobile site is optimized for these users.

Structured Data

Schema.org

Schema.org is a set of vocabularies used to add meaning to the information on a webpage in a way that is readable by machines (Google). Schema.org vocabularies include attributes for entities, relationships between entities and actions.
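A minimal JSON-LD sketch using the Schema.org NGO type; the property values are assumptions for illustration, not data taken from the site:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "NGO",
  "name": "FUSAI",
  "url": "https://fusai.org.sv/",
  "areaServed": "El Salvador"
}
</script>
```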

Open Graph Protocol

Medium impact / Easy to solve

We didn't detect any Open Graph tags on your webpage

Facebook developed the Open Graph protocol to enable the integration of any web page with its social media platform. Other social media platforms have also adopted the protocol, allowing you to control how your web pages are presented when shared across social media.
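A basic set of Open Graph tags for the <head> of the page; the description text and image URL are hypothetical placeholders:

```html
<meta property="og:type" content="website">
<meta property="og:title" content="FUSAI">
<meta property="og:description" content="Organización sin fines de lucro en El Salvador.">
<meta property="og:url" content="https://fusai.org.sv/">
<meta property="og:image" content="https://fusai.org.sv/images/portada.jpg">
```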

Microformats

We didn't detect any microformat items on your webpage

Designed for humans first and machines second, microformats use code (HTML/XHTML tags) originally intended for other purposes to add context to the content on a webpage. This helps machines (like Google!) to understand certain information (contact information, geographic coordinates, calendar events, etc.) intended for humans.
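As a sketch, an h-card (microformats2) marking up contact information; the location details are hypothetical:

```html
<div class="h-card">
  <span class="p-name">FUSAI</span>
  <a class="u-url" href="https://fusai.org.sv/">fusai.org.sv</a>,
  <span class="p-locality">San Salvador</span>,
  <span class="p-country-name">El Salvador</span>
</div>
```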

Security

Email Privacy

Warning! At least one email address has been found in plain text.

We don’t recommend adding plain text/linked email addresses to your webpages, as malicious bots scrape the web in search of email addresses to spam. Instead, consider using a contact form.

Modern websites tend to be SSL secured (HTTPS), as it provides an extra layer of security for visitors, particularly when they log in or submit data. In 2014, Google announced that HTTPS websites would receive a ranking boost over their HTTP counterparts.

While switching to HTTPS, make sure your site remains optimized and continues to load quickly, and follow best practices to ensure a smooth transition.

Fast websites make happy visitors. Enabling minification on assets like HTML, JavaScript and CSS files will reduce their transfer size. Every time a page is requested from your website, fewer bytes and lighter assets are sent over the network, resulting in faster delivery and a faster-loading website for your customers.

Asset Compression

Medium impact / Hard to solve

Perfect, all your assets are compressed.

Great! We didn't find uncompressed assets on your web page. Compressing assets reduces the amount of time it takes a user's browser to download files from your server, and enabling compression is an important part of reducing the amount of time it takes your website to load.

Fast websites make happy visitors. Caching assets such as images, JavaScript and CSS files allows a browser to keep these files in local storage so it doesn't have to download them every time it requests a page on your website. This lowers the bandwidth used and improves page load time.
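If the site runs on Apache (an assumption; adjust for your actual server software), browser caching can be enabled with mod_expires:

```text
# Hypothetical Apache config - tune the lifetimes to how often each asset type changes
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType text/css "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```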

Technologies

Server Uptime

Server IP

192.185.194.87

Server location: Houston

Your server's IP address has little impact on your SEO. Nevertheless, try to host your website on a server which is geographically close to your visitors. Search engines take the geolocation of a server into account as well as the server speed.

Branding

URL

A descriptive URL is better recognized by search engines. A user should be able to look at the address bar and make an accurate guess about the content of the page before reaching it (e.g., http://www.mysite.com/en/products).

Resource: Search for a good domain name. If no good names are available, consider a second-hand domain. To prevent brand theft, you might consider trademarking your domain name.

Favicon

Great, your website has a favicon.

Favicons are the small icons that represent a website, company and/or brand. They can be displayed (among other instances) in browser tabs or bookmarks. Google also displays favicons in a page's search snippet in mobile search results.

You can see how your site's favicon appears in search results in the Google Preview above.

Twitter Account

We found a Twitter profile for your brand, but it's not linked to fusai.org.sv. Linking your Twitter account to your website helps prevent brandjacking and can help make your social media marketing more effective. Here are a few tips to help create a Twitter promotion plan. Use Twitter Dashboard and Analytics to track and optimize your Twitter feed.