Welcome to Brenda Segna SEO Services and SEO Copywriting Blog

Thursday, August 27, 2009

SEO Services In A Tough Economy

I don't know of anyone who isn't feeling the pinch in these tough economic times. Now more than ever, businesses need good SEO services and Internet marketing to drive traffic to their sites. There are several things that can be done to increase your site's page rank and drive traffic to your site.

A good SEO Service Provider and Internet Marketing Company will review your site and customize a plan that meets your needs and your budget. SEO is not rocket science, but it is time-consuming and needs to be done consistently in order to see results.

At BLaST creative, we have learned from years of experience that white hat, organic SEO typically takes about three months before you start to see results. Depending on the competition and what they are doing with their own SEO, it could take even longer.

Here are some things you can do to help increase traffic to your site, as well as improve your page rank and search engine listings:

1. Meta tags: if your site doesn't have them, they need to be added, especially a good description and keywords.

2. Heading tags and alt tags: if you aren't currently using these on your site, you need to add them.

3. Content: is it fresh, and does it contain the keywords you are trying to target? If not, you need to update it as well.

4. Backlinks: Google has just changed its algorithm again, and you need to get relevant, organic backlinks to your site.

If you own a business in any of these fields (finance, photography, dentistry, hotels, restaurants, construction, HVAC, hardware stores, real estate) or any other industry, and you are interested in more information on SEO Services and Internet Marketing, please contact BLaST creative today!

Friday, May 1, 2009

Detecting and Fixing Site Visibility Issues

In terms of the Integrated Approach, site visibility refers to how well a page's source code complies with the requirements of search engine spiders. This topic is based on the knowledge that spiders are computer programs instructed to parse your HTML pages, analyze certain page areas, and take certain actions, such as requesting other pages by following the hyperlinks found on a given page.

Thus, we can conclude that specific page areas carry particular importance for spiders and therefore play a role in overall search engine visibility. Another conclusion is that a consistent site structure is essential for spiders to be able to crawl your site effectively.
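
To make the spider's behavior concrete, here is a rough sketch (my own illustration, not part of the original article) of how a spider reads a page: it downloads the HTML, parses out the hyperlinks, and queues them as the next pages to request. It uses only the Python standard library, and the URL at the bottom is a placeholder.

from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class LinkCollector(HTMLParser):
    """Collects the href value of every <a> tag on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)


def links_on_page(url):
    """Return the absolute URLs a spider would find on the given page."""
    html = urlopen(url).read().decode("utf-8", errors="replace")
    collector = LinkCollector()
    collector.feed(html)
    # Relative links are resolved against the page URL, just as a spider would do.
    return [urljoin(url, href) for href in collector.links]


if __name__ == "__main__":
    for link in links_on_page("http://www.example.com/"):
        print(link)  # the pages a spider would request next

A page that never shows up in that list, because the link to it is broken or sits too many jumps away, is effectively invisible to the spider, which is exactly what the problems below are about.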

Following are the main visibility problems that are usually encountered on a website (especially large sites):

• Broken links
• Broken redirects
• Missing ALT attributes
• Missing TITLE and META (keywords and description) tags
• Old pages which are not regularly updated
• Deep pages (located far from the home page or from the page where the spider enters your site)
• Excessive length of TITLE and META tags

Most of these problems (except for broken links and redirects) will not frustrate your visitors. These flaws, however, are potentially dangerous for your positions in the SERPs (search engine results pages) and for the indexing of your whole site by search engine spiders.

Let's get into a deeper explanation for each of these problems.

Broken links
Broken links are a common scourge for webmasters, especially with large and dynamic sites. A search engine robot will not be able to access a page that is hidden behind a broken link. Most robots, however, will not stop spidering your site when they meet a broken internal link, provided they have some other links to follow. But if the broken link is intended to point to a strategically important content page, the page won't be indexed and the problem then becomes more serious.

Broken external links are not as critical from the visibility point of view; rather, they indicate problems for the visibility of the site or page they point to.
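
As a rough illustration (again my own sketch, not part of the original article), the snippet below requests every internal link found on a page and reports any that come back with an error status. It uses only the Python standard library, extracts links with a deliberately simple regular expression, and the start URL is a placeholder.

import re
from urllib.error import HTTPError, URLError
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen


def internal_links(page_url):
    """Crude link extraction for a sketch; a real tool would parse the HTML properly."""
    html = urlopen(page_url).read().decode("utf-8", errors="replace")
    hrefs = re.findall(r'<a[^>]+href="([^"]+)"', html, re.IGNORECASE)
    site_host = urlparse(page_url).netloc
    links = [urljoin(page_url, href) for href in hrefs]
    return [link for link in links if urlparse(link).netloc == site_host]


def report_broken_links(page_url):
    for link in internal_links(page_url):
        try:
            urlopen(link, timeout=10)
        except HTTPError as err:      # the server answered with a 4xx/5xx status
            print(f"BROKEN ({err.code}): {link}")
        except URLError as err:       # DNS failure, refused connection, etc.
            print(f"UNREACHABLE: {link} ({err.reason})")


if __name__ == "__main__":
    report_broken_links("http://www.example.com/")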

Redirects and broken redirects
The most common redirect codes are 301 (Moved Permanently), 302 (Found), 304 (Not Modified), 305 (Use Proxy) and 307 (Temporary Redirect).

As a webmaster, you may implement a redirect in a number of different ways: by sending the corresponding HTTP headers from within your server-side code (PHP, ASP, Perl, etc.), by putting the instructions into the ".htaccess" file in the relevant directory on the server, or by putting a special META REFRESH tag directly into the HTML code of the page:

<meta http-equiv="refresh" content="0;url=http://www.mywebsite.com/new-file.htm">

In this case, the user-agent (either a visitor's browser or a search engine spider) will treat the tag as if it had encountered a 301 (Moved Permanently) redirect.

Actually, of all the redirects listed above, only one (301) is considered search engine-friendly and is recommended by the search engines. The others have been abused by spammers in the past (and are still often used in black-hat SEO) to create doorway pages for search engines and to cloak websites. Thus, search engines apply strict policies to pages for which they get any of the other redirect response codes from the server (300, 302 and above).

Moreover, this redirect abuse has led some search engines to ban sites that use META refresh tags to redirect the visitor. AltaVista is the strictest: sites with the refresh attribute set to less than 30 seconds have been banned as spam. The policies of other search engines vary. In the past, Google wasn't too worried about this kind of spam, since it relies mainly on link popularity to rank sites, but it has recently started to pay more attention to these issues.

Because of this, we recommend that you use a server-side redirect with the help of a dynamic technology like PHP or ASP. If this isn't possible, use the ".htaccess" file (on Apache servers) with the following line:

Redirect 301 /page1.htm http://www.yoursite.com/page2.htm

to redirect the visitors from page1.htm to page2.htm.

Broken redirects lead to the same results as broken links: spiders will not be able to index the pages hidden behind the broken redirect unless they find those pages through other links or redirects on your website.
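
A quick way to audit this (a sketch of my own, not from the original article) is to ask the server for each URL without following the redirect and look at the raw status code, so anything other than a clean 301 stands out. The example uses only the Python standard library, handles plain http URLs only, and the URLs listed are placeholders.

import http.client
from urllib.parse import urlparse


def redirect_status(url):
    """Return (status code, Location header) without following the redirect."""
    parts = urlparse(url)
    conn = http.client.HTTPConnection(parts.netloc, timeout=10)
    conn.request("HEAD", parts.path or "/")
    response = conn.getresponse()
    location = response.getheader("Location")
    conn.close()
    return response.status, location


if __name__ == "__main__":
    for url in ("http://www.yoursite.com/page1.htm",
                "http://www.yoursite.com/old-page.htm"):
        status, target = redirect_status(url)
        if status == 301:
            print(f"OK: {url} -> 301 -> {target}")
        elif 300 <= status < 400:
            print(f"CHECK: {url} uses a {status} redirect -> {target}")
        else:
            print(f"{url} answered with {status} (no redirect)")

Running the same check against the target of each redirect will also expose broken redirects, where the Location URL itself answers with an error.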

Missing ALT attributes
ALT attributes are used within <img> HTML tags to describe images textually when the browser does not display the image. The ALT attribute is also useful in providing additional information when a user hovers the mouse cursor over a displayed image.

Nowadays, ALT attributes are not as important for search visibility as they were a couple of years ago, so omitting them won't harm your rankings. However, it's still useful to include ALT attributes, especially to get your images ranked for particular keywords in image search, which also brings you organic traffic.
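
If you want to find the images that are missing ALT text, a short sketch like the one below will list them for a given page (standard library only, placeholder URL, and again my own illustration rather than part of the original article).

from html.parser import HTMLParser
from urllib.request import urlopen


class AltAudit(HTMLParser):
    """Records the src of every <img> tag with a missing or empty alt attribute."""

    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                self.missing.append(attrs.get("src", "(no src)"))


if __name__ == "__main__":
    url = "http://www.example.com/"
    audit = AltAudit()
    audit.feed(urlopen(url).read().decode("utf-8", errors="replace"))
    for src in audit.missing:
        print(f"Missing or empty ALT: {src}")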

Missing TITLE and META tags
This is probably the most serious gap in your readiness for an SEO campaign. The TITLE tag and the META keywords and description tags are the primary regions where search engines attempt to find relevant keywords for your site. If some of your pages are missing these TITLE and META tags, you should add them before starting any further optimization analysis.
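
To detect these gaps on a page, a sketch along the following lines can help. It flags a missing or empty TITLE tag, META description or META keywords; as with the other examples, it is my own illustration, uses only the Python standard library, and the URL is a placeholder.

from html.parser import HTMLParser
from urllib.request import urlopen


class HeadAudit(HTMLParser):
    """Collects the TITLE text and the named META tags of a page."""

    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.metas = {}

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and attrs.get("name"):
            self.metas[attrs["name"].lower()] = attrs.get("content") or ""

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data


if __name__ == "__main__":
    url = "http://www.example.com/"
    audit = HeadAudit()
    audit.feed(urlopen(url).read().decode("utf-8", errors="replace"))
    if not audit.title.strip():
        print("Missing or empty TITLE tag")
    for name in ("description", "keywords"):
        if not audit.metas.get(name, "").strip():
            print(f"Missing or empty META {name} tag")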

Old pages
One of the algorithms Google now applies when ranking Web pages is the so-called Sandbox algorithm. It means that pages which haven't been updated for a long time will gradually lose their positions in the rankings, even if they initially ranked highly. Google constantly crawls the Web and adds thousands of new pages to its index. These new pages have unique content, but it is their novelty or "newness" that may cause Google to rank them higher than the actual content would otherwise dictate. Google assumes that newer pages probably contain more up-to-date information than older ones. This process will gradually "dissolve" your site in the competition as your pages get "crowded out" of the top results by the new additions.

To retain your high positions, you need not only to keep building your link popularity, but also to update your pages often and add new content pages from time to time.

Deep Pages
An average search engine spider will not crawl your site deeper than two levels away from the home page, and even the most advanced spiders are instructed not to go further than four levels deep. By "depth" we mean the number of jumps that must be made from page to page along the links to get from the source page (most often your home page) to the destination page. So, if a spider enters your site on your home page, it will read all the pages linked to it, then all the pages those pages link to, and then most spiders will stop. If your site has any pages that sit deeper than this, your site structure needs optimization.

It's important to understand that link structure and file structure are two different things: you may keep all 500 of your pages in the root directory, yet some of them may still be five or more levels deep in terms of link jumps.
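
One way to spot pages that sit too deep (again a rough sketch of my own, standard library only, with a placeholder start URL) is a breadth-first crawl from the home page that records how many link jumps away each internal page is.

import re
from collections import deque
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen


def internal_links(page_url):
    """Crude link extraction, good enough for a depth sketch."""
    html = urlopen(page_url).read().decode("utf-8", errors="replace")
    hrefs = re.findall(r'<a[^>]+href="([^"]+)"', html, re.IGNORECASE)
    host = urlparse(page_url).netloc
    links = [urljoin(page_url, href) for href in hrefs]
    return [link for link in links if urlparse(link).netloc == host]


def link_depths(home_url, max_depth=4):
    """Breadth-first crawl: depth is the number of link jumps from the home page."""
    depths = {home_url: 0}
    queue = deque([home_url])
    while queue:
        url = queue.popleft()
        if depths[url] >= max_depth:
            continue
        try:
            found = internal_links(url)
        except OSError:      # broken link, timeout, unreachable host, etc.
            continue
        for link in found:
            if link not in depths:
                depths[link] = depths[url] + 1
                queue.append(link)
    return depths


if __name__ == "__main__":
    for url, depth in sorted(link_depths("http://www.yoursite.com/").items(),
                             key=lambda item: item[1]):
        print(depth, url)

Pages that only show up at depth three or four, or not at all, are the ones worth linking to more prominently from pages nearer the top of the site.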

TITLE and META tags length
The information placed in the TITLE tag and the "keywords" and "description" META tags is indexed by search engine spiders for use on the search results page, and there is a limited amount of space for it. If these tags are too long, they will be cut off in the search results display. Moreover, search engines will not read more than a certain number of characters in each tag. Further still, some search engine spiders may treat an excessive number of characters as spamming. Therefore, it's useful to check whether the length of these vital page areas is within sensible limits.
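
As a final illustration (again my own sketch, not from the original article), the snippet below compares the TITLE and META lengths of a page against commonly cited display limits. The exact cut-offs vary by search engine and change over time, so treat the numbers, like the placeholder URL, as assumptions; the simple regular expressions are enough for a rough check but assume the name attribute comes before content.

import re
from urllib.request import urlopen

# Assumed, approximate limits; each search engine applies its own cut-offs.
LIMITS = {"TITLE": 65, "META description": 155, "META keywords": 200}


def head_texts(url):
    html = urlopen(url).read().decode("utf-8", errors="replace")
    title = re.search(r"<title[^>]*>(.*?)</title>", html, re.I | re.S)
    desc = re.search(r'<meta\s+name="description"\s+content="([^"]*)"', html, re.I)
    keywords = re.search(r'<meta\s+name="keywords"\s+content="([^"]*)"', html, re.I)
    return {
        "TITLE": title.group(1).strip() if title else "",
        "META description": desc.group(1) if desc else "",
        "META keywords": keywords.group(1) if keywords else "",
    }


if __name__ == "__main__":
    for name, text in head_texts("http://www.yoursite.com/").items():
        if len(text) > LIMITS[name]:
            print(f"{name} is {len(text)} characters; aim for {LIMITS[name]} or fewer")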

For information on Search Engine Optimization, Internet Marketing and Web Design, please contact BLaST creative today!