Search engine marketing (SEM) is the set of marketing activities aimed at achieving certain objectives in search engines, whether by improving a website's visibility among the organic search results, by purchasing sponsored links, or by combining both strategies with other pay-per-click advertising campaigns. SEM is a set of methodologies aimed at improving a site's ranking in search engine results pages or, more generally, its accessibility through search engines. The term SEM also refers to the profession and work of SEM consultants, who carry out optimization projects on their clients' websites to improve the number of visits and conversions, or who plan keyword-based advertising campaigns.

The Smallest Is the Best... As Long As It Serves Its Purpose

It is true! In graphics optimization, seen as part of overall website optimization, the smallest is the best. Of course, the element in question should still serve its purpose of expressing an idea, and it should be understandable, clear, suggestive and good-looking. Let's look together at the right choices (in terms of web graphics optimization) to make when deciding what type of graphics to use on a website.

What kinds of web graphics are present on the Internet?

When it comes to their origin, there are two types of digital graphic files: vector graphics (created with software tools like CorelDRAW, FreeHand etc.) and raster graphics (photographs, 3D renders and any other type of bitmap file). Most websites on the World Wide Web use bitmap-type graphics in three different formats: GIF (standing for Graphics Interchange Format), JPEG (Joint Photographic Experts Group) and PNG (Portable Network Graphics). All three formats use different compression algorithms to considerably reduce the size of the graphic file. Because PNG is not as popular, I hope you will forgive me if I stick to GIF and JPEG in this article. You probably already know a lot about these (or you can browse the web and find plenty of information), so I'll get directly to the choices we must make when designing a website.
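The key difference between these compression families is that GIF's LZW (like PNG's DEFLATE) is lossless and thrives on large flat-color areas, while JPEG is lossy and targets continuous tones. As a rough illustration of the lossless case (using Python's standard zlib, whose DEFLATE algorithm is a cousin of these schemes, not the actual GIF encoder), compare how "flat color" byte patterns compress versus noisy, photograph-like ones:

```python
import random
import zlib

# Simulated pixel data: a logo-like image with two large flat-color areas...
flat = bytes([10] * 5000 + [200] * 5000)

# ...versus noisy, photograph-like data full of continuous tones.
random.seed(0)
noisy = bytes(random.randrange(256) for _ in range(10000))

flat_packed = zlib.compress(flat)
noisy_packed = zlib.compress(noisy)

# Lossless compression: decompressing restores the original data exactly.
assert zlib.decompress(flat_packed) == flat

print(f"flat:  {len(flat)} -> {len(flat_packed)} bytes")
print(f"noisy: {len(noisy)} -> {len(noisy_packed)} bytes")
```

The flat data shrinks to a few dozen bytes while the noisy data barely compresses at all, which is exactly why GIF suits logos and line art while photographs are better served by JPEG's lossy approach.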

What type of bitmap files will hold our web graphics?

Some specialists say that we should always go with JPEG, because it supports 16 million colors and produces quite small files. Well, this is not entirely true, I think. What I can tell you for sure is that we don't have to use the same format across the whole website. How's that? Let's see!

When should we go with GIF?

- for graphics with fewer colors: web logos, cartoon-like drawings and line-art (pure black and white) drawings; whenever it is OK to use the 256-color palette (or even the 216 "web-safe" color palette)
- for grayscale pictures with few halftones (high contrast)
- for graphics with smaller screen size (even with many colors) which rely on details; GIF compression is lossless and keeps sharp contours and clear definition between areas filled with different colors
- when we need GIF's "transparency" option, e.g. when a graphic should have a non-rectangular shape and/or we want to discard its background
- whenever an image saved in GIF format is smaller than one saved as JPEG, at a comparable display quality
- generally, for vector-generated graphics (unless they have blends and/or gradient fills)

When to use JPEG format?

- for color images with 16 million colors and many halftones (photographs, 3D render output files, any other continuous-tone images)
- for grayscale images which rely on subtle halftones
- for graphics/images with big screen size where colors and shades are more important than contours, outlines and boundaries
- whenever an image saved in JPEG format is smaller than one saved as GIF, at a comparable display quality
- generally, for photographs and similar images

What are the drawbacks of each format?

From my point of view, these are the main limitations for GIF and for JPEG formats:

For GIF: a limited number of colors (it can approximate 16 million colors, but only in dithered mode, which I do not recommend)

For JPEG: compression is lossy, so the quality of the graphics is reduced (loss of sharpness, "halo" filaments along edges, "pixelated" areas etc.)

Whatever format you choose, when it comes to graphics optimization as part of website optimization, the SMALLEST is the BEST! Besides the format choice, keep in mind these tips when you create and optimize your web graphics:

1. Minimize the screen size of your graphics to the point where they are still clear and suggestive.
2. Try saving a graphic in each of the two formats, at different quality levels (for JPEG) and different numbers of colors (for GIF). Do this until you find the best size/quality ratio that fits your needs.
3. Use vector graphics software and limit the number of colors when you create the non-photographic graphics for your website.
4. Put emphasis on shape, contour, silhouette and contrast when creating/processing your graphics.
5. Choose the resizing method in your image processor carefully when you change the size/resolution of your graphics; "anti-alias" is not always the best choice.

Never put ads on new sites right away, as Google may mark your site as an "ad farm" and refuse to index it. You should wait a month or so before beginning ad activity.

It has been seen that Google gives high rankings to sites with multiple inbound links. The important point is that such links should come from sites that themselves rank well.

You will not get any benefit from substandard link exchange sites, and moreover Google will sideline your site for link farming. Your inbound links must be relevant to the content of your website.

You can link your website with good social bookmarking services such as del.icio.us, reddit.com and digg.com, to name a few; many other such sites will serve your purpose. However, most social bookmarking sites use the rel="nofollow" attribute, which does not help increase a website's PageRank.

Google detects duplicate and copied content and may not index sites that use it. It is always advisable to give relevant links and disclose the content source.

Always link to a single page with one canonical URL. Always refrain from excessive use of keywords, over-optimizing and irrelevant keyword use, as this adversely affects a website's page ranking.

It used to be common practice to use invisible text at the end of the main page, but this is no longer a sensible thing to do, as modern search engines detect it and may penalize your site. Also try to avoid query characters such as "?", as search engines do not handle dynamic pages with such characters well.

It is essential to perform redirects correctly. When moving content from one page to another, make sure to use a 301 redirect, which indicates that the page has moved permanently.
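As a minimal sketch of the mechanics, here is a 301 redirect implemented with Python's standard http.server; the paths in the MOVED mapping are hypothetical examples, not from the original article. In practice you would configure this in your web server instead (e.g. `Redirect permanent /old-page.html /new-page.html` in Apache), but the sketch shows what the crawler actually receives:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical mapping of moved pages to their new locations.
MOVED = {"/old-page.html": "/new-page.html"}

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in MOVED:
            # 301 tells crawlers the move is permanent, so they can
            # transfer the old page's ranking to the new URL.
            self.send_response(301)
            self.send_header("Location", MOVED[self.path])
            self.end_headers()
        else:
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(b"<h1>New page</h1>")

    def log_message(self, *args):
        pass  # keep the demo quiet

# To serve: HTTPServer(("127.0.0.1", 8000), RedirectHandler).serve_forever()
```

A crawler (or browser) requesting /old-page.html gets a 301 status plus a Location header pointing at the new URL, which is exactly the signal the paragraph above describes.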

Google suggests that there should be fewer than 100 links on a single page.
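If you want to check your own pages against that guideline, counting the links a crawler would follow is straightforward with Python's standard html.parser; the sample page below is a made-up illustration:

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Counts outgoing <a href="..."> links, roughly as a crawler would."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

# Hypothetical page fragment for demonstration.
page = """
<html><body>
  <a href="/about.html">About</a>
  <a href="/contact.html">Contact</a>
  <a name="anchor-only">Not a link</a>
</body></html>
"""

counter = LinkCounter()
counter.feed(page)
print(len(counter.links))  # 2 links on this page
```

Run this over a saved copy of your page; if the count creeps past 100, it is worth trimming navigation or splitting the page.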

Always link to sites with equal or higher PageRank, and make sure the links are relevant to the content of the page.

Never use unauthorized software or programs to check rankings or submit pages, as you may be violating terms of service.

Are article submissions worth the time? When doing manual submissions it takes me 10 to 15 minutes on average to log in, format and submit an article to an article directory. I have to make at least 10 submissions to get feasible exposure for my articles, so this process can take more than two hours a day! To make the most of my time I have to make sure that the article directories I am submitting to can bring me as many visitors and backlinks as possible. I can name the top five directories: EzineArticles.com, Buzzle.com, GoArticles.com, ArticlesFactory.com and WebProNews.com. Submitting to these is a must! EzineArticles.com and Buzzle.com can bring you a lot of traffic, GoArticles.com brings you backlinks, and ArticlesFactory.com brings PageRank (my profile page there is now PR5, with only 12 submitted articles).

But the top five directories are not enough. As I said above, I usually do at least 10 manual submissions, so I have to choose my directories wisely. The only two ratings easily available to web users for judging the quality of websites and webpages are Alexa Rating and Google PageRank. Both are flawed and often do not reflect the real authority or traffic of a page, but since there is nothing better, I have to stick to these two when deciding if a directory is worth submitting to. Of course, one must not forget the power of themed sites. Whenever I have a choice between a general directory and a directory focusing on my topic, I choose the latter.

To make the choosing process easier I obtained a list of article directories with their names and URLs. Yesterday, I downloaded Eclipse and wrote a simple Java application that queries the Alexa Web Service for rating data and makes PageRank lookups. I ran this application on my list this morning, and here are the top results with Alexa rating < 100,000, sorted ascending:

The first basic truth you need to learn about SEO is that search engines are not humans. While this might be obvious to everybody, the differences between how humans and search engines view web pages aren't. Unlike humans, search engines are text-driven. Although technology advances rapidly, search engines are far from intelligent creatures that can feel the beauty of a cool design or enjoy the sounds and movement in movies. Instead, search engines crawl the Web, looking at particular site items (mainly text) to get an idea of what a site is about. This brief explanation is not the most precise, because, as we will see next, search engines perform several activities in order to deliver search results: crawling, indexing, processing, calculating relevancy and retrieving results.

First, search engines crawl the Web to see what is there. This task is performed by a piece of software called a crawler or a spider (or Googlebot, as is the case with Google). Spiders follow links from one page to another and index everything they find on their way. Given the number of pages on the Web (over 20 billion), it is impossible for a spider to visit a site daily just to see if a new page has appeared or if an existing page has been modified. Sometimes crawlers will not visit your site for a month or two, so during this time your SEO efforts will not be rewarded. But there is nothing you can do about it, so just be patient.

What you can do is check what a crawler sees on your site. As already mentioned, crawlers are not humans, and they do not see images, Flash movies, JavaScript, frames, or password-protected pages and directories. If you have tons of these on your site, you'd better run the Spider Simulator below to see whether these goodies are viewable by the spider. If they are not viewable, they will not be spidered, not indexed, not processed, etc.; in a word, they will be non-existent for search engines.
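To make the text-driven nature of crawlers concrete, here is a rough sketch (not the actual Spider Simulator mentioned above) of how a page reduces to what a spider can use: plain text and link targets survive, while script contents and images effectively vanish. The sample page is a made-up illustration:

```python
from html.parser import HTMLParser

class SpiderView(HTMLParser):
    """Rough sketch of what a text-driven crawler 'sees': visible text
    and link targets, but nothing inside <script>/<style>, and no images."""
    SKIP = {"script", "style"}

    def __init__(self):
        super().__init__()
        self.text = []
        self.links = []
        self._skipping = 0

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self._skipping += 1
        elif tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)
        # An <img> tag contributes nothing visible to the crawler
        # except, at best, its alt text (not modeled here).

    def handle_endtag(self, tag):
        if tag in self.SKIP and self._skipping:
            self._skipping -= 1

    def handle_data(self, data):
        if not self._skipping and data.strip():
            self.text.append(data.strip())

# Hypothetical page mixing crawlable text with invisible goodies.
page = """<html><body>
<script>var fancy = "invisible to spiders";</script>
<h1>Welcome</h1>
<img src="logo.gif">
<p>Readable content. <a href="/more.html">More</a></p>
</body></html>"""

spider = SpiderView()
spider.feed(page)
print(" ".join(spider.text))   # Welcome Readable content. More
print(spider.links)            # ['/more.html']
```

Everything the script and the image "said" is gone; only the headline, the paragraph text and the link target remain, which is exactly the inventory the search engine will index.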