Saturday, March 31, 2012

Google Panda Updates and How to Survive with Trend Content


I know that some of us are doing our best to figure out how to survive when Google rolls out a new Panda update. I read several articles and found this one; I think reading it can give you some ideas on how to survive the Google Panda updates.

First, let's look at what the Google Panda Update is...

The Google Panda Update is an update to the ranking algorithms of Google, the world's largest search engine. The Panda update is designed to demote sites that scrape or imitate content from other sites around the internet, a reminder of just how forceful Google can be when it comes to dealing with duplicate content. While it has boosted the rankings of some sites like YouTube or Vimeo, it has had a negative effect on many others.

Next, let's look at which sites have been affected by the Google Panda updates.

The sites that appear to have been hit hardest are those known as "content farms". Wikipedia defines a content farm as a site that hires large numbers of freelance and professional writers to create content designed to maximize its retrieval by search engines, because the text is written to satisfy the rules of Google's algorithms.

There are many websites whose content is simply spam but contains high volumes of keywords and keyword phrases. The Google Panda update is designed to demote those sites and drop their rankings through the floor. Now, instead of focusing exclusively on the keywords used on these sites, the update causes the search engine to look through the content for text that is actually relevant and accurate.
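
To make "duplicate content" concrete, here is a minimal Python sketch of one classic way to compare two pages for near-duplicate text, using word shingles and Jaccard similarity. This illustrates the general technique only, not Google's actual (unpublished) method, and the sample strings are invented:

    def shingles(text, k=3):
        """Break text into overlapping k-word 'shingles' for comparison."""
        words = text.lower().split()
        return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

    def similarity(page_a, page_b):
        """Jaccard similarity: shared shingles / all distinct shingles."""
        a, b = shingles(page_a), shingles(page_b)
        if not a or not b:
            return 0.0
        return len(a & b) / len(a | b)

    # Pages scoring above some cutoff (say 0.5, an arbitrary choice here)
    # would be treated as near-duplicates in this toy model.
    print(similarity("the panda update demotes scraped content",
                     "the panda update demotes copied content"))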

The main sites that have been hit by this update have been:

  • Sites such as Demand Media, Suite 101, Answers.com, and many others that publish large amounts of keyword-heavy text whose only purpose is to improve the rankings of individual websites.
  • A number of technology sites that publish similar reviews of the same products have also taken a serious hit, with one of the major tech review sites in the UK not even making it onto the first two pages of Google's search results.
  • Many sites that offer discount coupons or special shopping offers have also been hit hard, because many of their pages appear to contain duplicate or spam content.
  • Many websites that display sales in various locations have also taken a hit.

What sites does the Google Panda Update target?

The Google Panda Update has been known to target sites that have:
  • A high amount of duplicate content, or content that is considered duplicate.
  • A low amount of original content, or content that is unique to the website.
  • Many advertisements that appear inappropriate or irrelevant to the topic of the website.
  • Many pages with little original content.
  • Page title tags and content that don't match the search queries for which the page performs well.
  • Over-optimized pages, or text so stuffed with keywords that it no longer reads naturally.
  • Pages with low visit times.
  • Pages with high bounce rates.
  • A poor rate of returning visitors.
  • Few links leading to the page.
  • Boilerplate or identical content on every page.
  • Few links leading to the page from social media sites.

If your site has any of the above characteristics, you may want to take action to correct the website and ensure that it isn't discarded or demoted by the Google Panda update. A few of these signals are easy to measure yourself, as the sketch below shows.
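
Several of the engagement signals above (visit time, bounce rate, returning visitors) are simple ratios you can compute from your own analytics data. A minimal sketch, assuming a hypothetical list of session records:

    # Hypothetical session records: (visitor_id, pages_viewed, seconds_on_site)
    sessions = [
        ("a", 1, 8), ("b", 4, 190), ("a", 3, 120), ("c", 1, 5),
    ]

    total = len(sessions)
    # A "bounce" is a session that viewed only a single page.
    bounce_rate = sum(1 for _, pages, _ in sessions if pages == 1) / total
    avg_visit_time = sum(secs for _, _, secs in sessions) / total

    visitors = [vid for vid, _, _ in sessions]
    unique = set(visitors)
    returning_rate = sum(1 for v in unique if visitors.count(v) > 1) / len(unique)

    print(f"bounce rate {bounce_rate:.0%}, average visit {avg_visit_time:.0f}s, "
          f"returning visitors {returning_rate:.0%}")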

What can I do to change my site?

To reduce the effect that the Google Panda update will have on your website, it is vital to make some changes. The following changes may be simple, but they are essential to ensure that your website doesn't take a hit from the Google Panda update:

- Get rid of excess advertisements on your website. Pages that have too much advertising are usually demoted by the update, so get rid of any advertising that is excessive or irrelevant.

- Make sure that your pages contain no content duplicated from any other website. Ensure that all of the content posted on your pages is completely unique.

- Check whether any of your website's individual pages rank for a keyword that is not that page's actual target keyword. Google is penalizing pages with this mismatch, so make the changes needed to correct the problem.

- Always ensure that the headlines of your pages actually match the text and content of those pages. Pages with this discrepancy are being hit hard by the latest update, so take the time to correct your pages (see the sketch after this list).

- Make sure that any content that you post on your website is actually useful and authoritative. The more value your content provides those who read it, the higher your Google ranking will be with the Google Panda Update.

- Try adding a few more words to each page, as search engines tend to rank pages with substantial word counts more highly.

- Make sure that your website is highly visible in social media sites. Get people to post comments on their social media sites, and make sure to share your website with as many people as possible to ensure that it doesn't get demoted.

- Try to build as many links to your page as possible, using every method you can. Don't only focus on social media, but work on getting other websites, forums, and article directories to link back to your website.
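
As promised above, here is a small sketch of the headline-matching check: it flags pages whose title words barely appear in the body text. The 20% threshold and the helper name are illustrative assumptions, not anything Google publishes:

    import re

    def title_matches_body(title, body, threshold=0.2):
        """Flag pages whose title words barely appear in the body text."""
        title_words = set(re.findall(r"[a-z0-9]+", title.lower()))
        body_words = set(re.findall(r"[a-z0-9]+", body.lower()))
        if not title_words:
            return False
        overlap = len(title_words & body_words) / len(title_words)
        return overlap >= threshold  # 0.2 is an arbitrary illustrative cutoff

    print(title_matches_body("Skydiving Equipment Guide",
                             "Our guide covers parachutes and other equipment."))
    # True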


Wednesday, March 21, 2012

Link Building Definition


Link building is one of the most important techniques in off-page optimization. Now I want to discuss what link building is.

A short definition of link building: it is the process of establishing relevant links to your website, which can help your site achieve higher rankings in search engines and drive targeted traffic to it.

When our clients ask what link building is, we explain it carefully and help them understand that it is not only the quantity of inbound links to your site that helps you rank well, but rather the quality of those links.

Today there are some link builders who, instead of helping your site reach a higher rank, put it at risk of being penalized because of unethical link-building methods.
Here are some unethical link-building methods:
  • Spamming all sorts of web forums with bogus responses that include links to your site.
  • Automated spam responses to blog posts.
  • Creating websites solely for the purpose of linking to other sites. These are sometimes referred to as “spam directories.”
Search engines are now more sophisticated; they sniff out black hat SEO practices. If a search engine crawls a site and determines that it is engaged in unethical link-building practices, it will automatically penalize the site severely. It will bury your site, making it almost impossible to find through the search engines.

The only way to achieve better rankings with the major search engines is through permanent, relevant links from established sites that engage in ethical link-building practices. Such links can also drive targeted traffic to your site.

Link building is a key facet of any successful Internet marketing strategy: if your site features relevant, factual, well-written information, the administrators of other websites will want to link to it because it offers their readers something of interest and value.

There are no shortcuts in a successful link building campaign and even when it is done properly, you should not expect to reap the benefits of link building overnight. Patience and persistence are the keys because it can take time to build links from credible sources. 

While link building is a key element of an Internet marketing strategy, link building alone will not help your site achieve its natural ranking with the search engines. Only if link building is incorporated in a comprehensive search engine optimization approach will your site gain the full benefits from links from other sites and achieve its potential.

Monday, March 19, 2012

+ White Hat SEO +

To improve a Web page's position in a SERP, you have to know how search engines work. Search engines categorize Web pages based on keywords -- important terms that are relevant to the content of the page. Suppose, as a running example, you have a page about skydiving: the term "skydiving" should be a keyword, but a term like "bungee jumping" wouldn't be relevant.

Most search engines use computer programs called spiders or crawlers to search the Web and analyze individual pages. These programs read Web pages and index them according to the terms that show up often and in important sections of the page. There's no way for a search engine spider to know your page is about skydiving unless you use the right keywords in the right places.
Here are some general tips about keyword placement:
  • One place you should definitely include keywords is in the title of your Web page. You might want to choose something like "Skydiving 101" or "The Art of Skydiving."
  • Another good place to use keywords is in headers. If your page has several sections, consider using header tags and include important keywords in them. In our example, headers might include "Skydiving Equipment" or "Skydiving Classes."
  • Most SEO experts recommend that you use important keywords throughout the Web page, particularly at the top, but it's possible to overuse keywords. Your skydiving site would obviously use the word "skydiving" as a keyword, but it might also include other keywords like "base jumping" or "parachute." If you use a keyword too many times, some search engine spiders will flag your page as spam. That's because of a black hat technique called keyword stuffing, but more on that later.
Keywords aren't the only important factor search engines take into account when generating SERPs. Just because a site uses keywords well doesn't mean it's one of the best resources on the Web. To determine the quality of a Web page, most automated search engines use link analysis. Link analysis means the search engine looks to see how many other Web pages link to the page in question.

Going back to our skydiving example, if a search engine sees that hundreds of other Web pages related to skydiving are linking to your Web page, the engine will give your page a higher rank. Search engines like Google weigh the importance of links based on the rank of the linking pages. In other words, if the pages linking to your site are themselves ranked high in Google's system, they boost your page's rank more than lesser-ranked pages.

So, how do you get sites to link to your page? That's a tricky task, but make sure your page is a destination people want to link to, and you're halfway there. Another way is to offer link exchanges with other sites that cover material related to your content. You don't want to trade links with just anyone because many search engines look to see how relevant the links to and from your page are to the information within your page. Too many irrelevant links and the search engine will think you're trying to cheat the system.
In the next section, we'll look more closely at ways people try to fool search engines into ranking their pages higher on a SERP.

source page:  http://computer.howstuffworks.com/search-engine-optimization2.htm

Thursday, March 15, 2012

+ Black Hat SEO +

Some people seem to believe that on the Web, the ends justify the means. There are lots of ways webmasters can try to trick search engines into listing their Web pages high in SERPs, though such a victory doesn't usually last very long.

One of these methods is called keyword stuffing, which skews search engine results by overusing keywords on the page. Usually webmasters will put repeated keywords toward the bottom of the page where most visitors won't see them. They can also use invisible text, text with a color matching the page's background. Since search engine spiders read content through the page's HTML code, they detect text even if people can't see it. Some search engine spiders can identify and ignore text that matches the page's background color.

Webmasters might include irrelevant keywords to trick search engines. The webmasters look to see which search terms are the most popular and then use those words on their Web pages. While search engines might index the page under more keywords, people who follow the SERP links often leave the site once they realize it has little or nothing to do with their search terms.

A webmaster might create Web pages that redirect visitors to another page. The webmaster creates a simple page that includes certain keywords to get listed on a SERP. The page also includes a program that redirects visitors to a different page that often has nothing to do with the original search term. With several pages that each focus on a current hot topic, the webmaster can get a lot of traffic to a particular Web site.

Page stuffing also cheats people out of a fair search engine experience. Webmasters first create a Web page that appears high up on a SERP. Then, the webmaster duplicates the page in the hopes that both pages will make the top results. The webmaster does this repeatedly with the intent to push other results off the top of the SERP and eliminate the competition. Most search engine spiders are able to compare pages against each other and determine if two different pages have the same content.

Selling and farming links are popular black hat SEO techniques. Because many search engines look at links to determine a Web page's relevancy, some webmasters buy links from other sites to boost a page's rank. A link farm is a collection of Web pages that all interlink with one another in order to increase each page's rank. Small link farms seem pretty harmless, but some link farms include hundreds of Web sites, each with a Web page dedicated just to listing links to every other site in the farm. When search engines detect a link selling scheme or link farm, they flag every site involved. Sometimes the search engine will simply demote every page's rank. In other cases, it might ban all the sites from its indexes.

Cheating the system might result in a temporary increase in visitors, but since people normally don't like to be fooled, the benefits are questionable at best. Who wants to return to a site that isn't what it claims to be? Plus, most search engines penalize Web pages that use black hat techniques, which means the webmaster trades a short success for a long-term failure.

In the next section, we'll look at some factors that make SEO more difficult.


Black hat seo source: http://computer.howstuffworks.com/search-engine-optimization3.htm

Wednesday, March 14, 2012

- Blog Commenting -


Blog commenting is considered one of the easiest ways to get free one-way links to your website. All that is required of you is simply to leave a comment on blogs related to your site and include some anchor text linking back to it.

There are many successful blogs that discuss a wide variety of topics, but when you want to create a marketing ring of websites that link to each other, you will do far better to stick to one topic. The blogosphere is a very big world, and there's enough space for you to participate in it. First, you can comment on other people's blogs. Oftentimes, blogging platforms allow you to link your name to your website. This way other blog owners and visitors can visit you, and blog commenting can also help them get a general idea about you.

Blogging is significant because of its power to communicate and spread ideas quickly. Blogs are citizen media tools that represent your markets. These markets form communities of discourse around products, services, and niches. They carry on real-time conversations through active content, ideas, and opinions; those opinions are viral, and consumers exercise control through choice and voice.

They choose what messages to receive and voice their ideas and opinions using blogs. Blogs accelerate the speed of communication, provide authoritative and quality content, and lay the foundation for highly influential online social markets. The blogging benefits for businesses are increased and accelerated exposure for your message, the opportunity to build reputation and greater value in your niche markets much more quickly, and the ability to direct traffic and clicks to the messages that matter most to your business.
ADVANTAGES
»It is a really good strategy for getting quality backlinks to your website.
»It is one of the most effective strategies for an immediate rise in traffic to your website.
»It can improve your Google position within a short period, even for highly competitive keywords.


source: http://www.therealtraffic.com/blogcommenting.htm

Monday, March 12, 2012

20 Basic SEO Terms You Should Know


1. SEM: Stands for Search Engine Marketing, and as the name implies it involves marketing services or products via search engines. SEM is divided into two main pillars: SEO and PPC. SEO stands for Search Engine Optimization, and it is the practice of optimizing websites to make their pages appear in the organic search results. PPC stands for Pay-Per-Click, and it is the practice of purchasing clicks from search engines. The clicks come from sponsored listings in the search results.

2. Backlink: Also called an inlink or simply a link, it is a hyperlink on another website pointing back to your own website. Backlinks are important for SEO because they directly affect the PageRank of any web page, influencing its search rankings.

3. PageRank: PageRank is an algorithm that Google uses to estimate the relative importance of pages around the web. The basic idea behind the algorithm is that a link from page A to page B can be seen as a vote of trust from A to B. The higher the number of links to a page (weighted by their value), therefore, the higher the probability that the page is important.
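
The "vote of trust" idea can be sketched with the classic PageRank power iteration from the original 1998 paper. This is a simplified toy version, not Google's production system; it ignores refinements such as dangling pages, the damping factor of 0.85 is the value suggested in that paper, and the tiny link graph is invented:

    def pagerank(links, damping=0.85, iterations=50):
        """Toy PageRank: `links` maps each page to the pages it links to."""
        pages = list(links)
        rank = {p: 1.0 / len(pages) for p in pages}
        for _ in range(iterations):
            new_rank = {p: (1 - damping) / len(pages) for p in pages}
            for page, outlinks in links.items():
                for target in outlinks:
                    # Each outgoing link passes on an equal share of the
                    # linking page's rank: a weighted "vote of trust".
                    new_rank[target] += damping * rank[page] / len(outlinks)
            rank = new_rank
        return rank

    # B is linked from both A and C, so it ends up with the highest rank.
    print(pagerank({"A": ["B"], "B": ["C"], "C": ["A", "B"]}))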

4. Linkbait: A linkbait is a piece of web content published on a website or blog with the goal of attracting as many backlinks as possible (in order to improve one’s search rankings). Usually it’s a written piece, but it can also be a video, a picture, a quiz or anything else. A classic example of linkbait is the “Top 10” list that tends to become popular on social bookmarking sites.

5. Link farm: A link farm is a group of websites where every website links to every other website, with the purpose of artificially increasing the PageRank of all the sites in the farm. This practice was effective in the early days of search engines, but today it is seen as a spamming technique (and thus can get you penalized).

6. Anchor text: The anchor text of a backlink is the text that is clickable on the web page. Having keyword-rich anchor text helps with SEO because Google will associate those keywords with the content of your website. If you have a weight loss blog, for instance, it would help your search rankings if some of your backlinks had “weight loss” as their anchor text.

7. NoFollow: The nofollow is a link attribute used by website owners to signal to Google that they don’t endorse the website they are linking to. This can happen either when the link is created by the users themselves (e.g., blog comments), or when the link was paid for (e.g., sponsors and advertisers). When Google sees the nofollow attribute it will basically not count that link for the PageRank and search algorithms.
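
To see anchor text and the nofollow attribute side by side, here is a sketch using Python's standard html.parser that pulls out each link's target, its anchor text, and whether it carries rel="nofollow" (the sample HTML is invented):

    from html.parser import HTMLParser

    class LinkExtractor(HTMLParser):
        """Collect (href, anchor text, nofollow?) for every link in a page."""
        def __init__(self):
            super().__init__()
            self.links = []
            self._current = None

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                a = dict(attrs)
                self._current = [a.get("href") or "", "",
                                 "nofollow" in (a.get("rel") or "")]

        def handle_data(self, data):
            if self._current is not None:
                self._current[1] += data  # accumulate the anchor text

        def handle_endtag(self, tag):
            if tag == "a" and self._current is not None:
                self.links.append(tuple(self._current))
                self._current = None

    parser = LinkExtractor()
    parser.feed('<a href="http://example.com" rel="nofollow">weight loss</a>')
    print(parser.links)  # [('http://example.com', 'weight loss', True)]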

8. Link Sculpting: By using the nofollow attribute strategically, webmasters were able to channel the flow of PageRank within their websites, thus increasing the search rankings of desired pages. This practice is no longer effective, as Google recently changed how it handles the nofollow attribute.

9. Title Tag: The title tag is literally the title of a web page, and it’s one of the most important factors in Google’s search algorithm. Ideally your title tag should be unique and contain the main keywords of your page. You can see the title tag of any web page at the top of the browser while navigating it.

10. Meta Tags: Like the title tag, meta tags are used to give search engines more information regarding the content of your pages. The meta tags are placed inside the HEAD section of your HTML code, and thus are not visible to human visitors.
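
As a quick illustration of where these tags live, here is a sketch that pulls the title and meta description out of a page's HEAD with simple regular expressions. Regexes are fine for a toy example like this, though a real HTML parser is more robust; the sample markup is invented:

    import re

    html = """<head>
      <title>Skydiving 101 - Equipment and Classes</title>
      <meta name="description" content="A beginner's guide to skydiving.">
    </head>"""

    title = re.search(r"<title>(.*?)</title>", html, re.S).group(1)
    description = re.search(r'<meta name="description" content="(.*?)"', html).group(1)

    print(title)        # Skydiving 101 - Equipment and Classes
    print(description)  # A beginner's guide to skydiving.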

11. Search Algorithm: Google’s search algorithm is used to find the most relevant web pages for any search query. The algorithm considers over 200 factors (according to Google itself), including the PageRank value, the title tag, the meta tags, the content of the website, the age of the domain and so on.

12. SERP: Stands for Search Engine Results Page. It’s basically the page you’ll get when you search for a specific keyword on Google or on other search engines. The amount of search traffic your website will receive depends on the rankings it will have inside the SERPs.

13. Sandbox: Google basically has a separate index, the sandbox, where it places all newly discovered websites. When websites are on the sandbox, they won’t appear in the search results for normal search queries. Once Google verifies that the website is legitimate, it will move it out of the sandbox and into the main index.

14. Keyword Density: To find the keyword density of any particular page you just need to divide the number of times that keyword is used by the total number of words in the page. Keyword density used to be an important SEO factor, as the early algorithms placed a heavy emphasis on it. This is not the case anymore.
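
The definition above translates directly into a few lines of Python; this is just the arithmetic from the paragraph:

    def keyword_density(text, keyword):
        """Occurrences of the keyword divided by total words, as a percentage."""
        words = text.lower().split()
        if not words:
            return 0.0
        return 100.0 * words.count(keyword.lower()) / len(words)

    sample = "skydiving gear for skydiving fans who love skydiving"
    print(keyword_density(sample, "skydiving"))  # 37.5 - clearly stuffed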

15. Keyword Stuffing: Since keyword density was an important factor on the early search algorithms, webmasters started to game the system by artificially inflating the keyword density inside their websites. This is called keyword stuffing. These days this practice won’t help you, and it can also get you penalized.

16. Cloaking: This technique involves making the same web page show different content to search engines and to human visitors. The purpose is to get the page ranked for specific keywords, and then use the incoming traffic to promote unrelated products or services. This practice is considered spamming and can get you penalized (if not banned) on most search engines.

17. Web Crawler: Also called a search bot or spider, it’s a computer program that browses the web on behalf of search engines, trying to discover new links and new pages. This is the first step in the indexing process.

18. Duplicate Content: Duplicate content generally refers to substantive blocks of content within or across domains that either completely match other content or are appreciably similar. You should avoid having duplicate content on your website because it can get you penalized.

19. Canonical URL: Canonicalization is a process for converting data that has more than one possible representation into a “standard” canonical representation. A canonical URL, therefore, is the standard URL for accessing a specific page within your website. For instance, the canonical version of your domain might be http://www.domain.com instead of http://domain.com.
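
Here is a small sketch of URL canonicalization using Python's standard urllib.parse. The normalization rules (lowercase the host, prefer the www form, strip the trailing slash) are one possible convention chosen for illustration, not a standard:

    from urllib.parse import urlparse, urlunparse

    def canonicalize(url):
        """Normalize a URL to one standard form (illustrative rules only)."""
        parts = urlparse(url)
        host = parts.netloc.lower()
        if not host.startswith("www."):
            host = "www." + host  # assumed preference for the www form
        path = parts.path.rstrip("/") or "/"
        return urlunparse((parts.scheme.lower(), host, path, "", "", ""))

    print(canonicalize("HTTP://Domain.com/page/"))  # http://www.domain.com/page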

20. Robots.txt: This is nothing more than a text file, placed in the root of the domain, that is used to inform search bots about the structure of the website. For instance, via the robots.txt file it’s possible to block specific search robots and to restrict access to specific folders or sections inside the website.
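
Here is what a minimal robots.txt might look like, checked with Python's built-in urllib.robotparser; the file contents and URLs are invented for illustration:

    from urllib import robotparser

    # A hypothetical robots.txt blocking one folder for all crawlers.
    rules = """User-agent: *
    Disallow: /private/
    """

    rp = robotparser.RobotFileParser()
    rp.parse(rules.splitlines())
    print(rp.can_fetch("*", "http://www.domain.com/private/page.html"))  # False
    print(rp.can_fetch("*", "http://www.domain.com/public/page.html"))   # True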

read more:  http://www.dailyblogtips.com/20-seo-terms-you-should-know/

10 Basic SEO tips everyone should know

"SEO" means finding ways to increase your site's appearance in web visitors' search results. This generally means more traffic to your site.

While intense SEO can involve complex site restructuring with a firm (or consultant) that specializes in this area, there are a few simple steps you can take yourself to increase your search engine ranking.

All it requires is a little effort, and some re-thinking of how you approach content on your site.

10 Basic SEO Tips To Get You Started:

1. Monitor where you stand
-- You won't know if your SEO efforts are working unless you monitor your search standings. MarketingVox suggests that you keep an eye on your page rank with tools like Alexa and the Google toolbar.

-- It's also important to check your referrer log regularly to track where your visitors are coming from and the search terms they're using to find your site.

2. Keywords
-- You should be conscious of placing appropriate keywords throughout every aspect of your site: your titles, content, URLs, and image names. Think about your keywords as search terms -- how would someone looking for information on this topic search for it?

BEWARE: Putting ridiculous amounts of keywords on your site will get you labeled as a spammer, and search engine spiders are programmed to ignore sites guilty of "keyword-stuffing." Be strategic in your keyword use.

3. Link back to yourself
-- There is probably no more basic strategy for SEO than the integration of internal links into your site -- it is an easy way to boost traffic to individual pages.
-- You should make it standard practice to link back to your archives frequently when creating new content. "The more relevant words point to a page, the more likely that page is to appear in search results when users run a query with those terms."

As with all other SEO approaches, be sure your links are appropriate, and be careful not to cross the line into excessive linking -- you don't want your visitors to get annoyed.

4. Create a sitemap
-- Adding a site map -- a page listing and linking to all the other major pages on your site -- makes it easier for spiders to search your site.

"The fewer clicks necessary to get to a page on your website, the better," advises MarketingVox

5. Search-friendly URLs
-- Make your URLs more search-engine-friendly by naming them with clear keywords.

SEO Consult explains: "For instance, it’s easy to understand what ‘www.your_friendly_url.com’ would contain. It’s less easy to understand if the in-house classification system of the business is used."
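
One common way to get search-friendly URLs is to "slugify" a page title into lowercase words joined by hyphens. A minimal sketch:

    import re

    def slugify(title):
        """Turn a page title into a readable, keyword-bearing URL slug."""
        slug = title.lower()
        slug = re.sub(r"[^a-z0-9]+", "-", slug)  # non-alphanumerics become hyphens
        return slug.strip("-")

    print(slugify("10 Basic SEO Tips Everyone Should Know"))
    # 10-basic-seo-tips-everyone-should-know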

6. Avoid Flash
-- Flash might look pretty, but it does nothing for your SEO. According to the Search Engine Journal, "Frames, Flash and AJAX all share a common problem – you can’t link to a single page... Don’t use Frames at all and use Flash and AJAX sparingly for best SEO results."

"If you absolutely MUST have your main page as a splash page that is all Flash or one big image, place text and navigation links below the fold," the post continues.

7. Image descriptions
-- Spiders can only read text, not the text embedded in your images -- which is why you need to make the words associated with your images as descriptive as possible.

Start with your image names: adding an "ALT" attribute (commonly called an ALT tag) lets you include a keyword-rich description for every image on your site.

The visible text around your images is valuable for SEO: MarketPosition suggests adding captions to all your pictures and being descriptive with the text in immediate physical proximity to your images.
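
As a quick self-check for this tip, here is a sketch that flags img tags with a missing or empty alt attribute, again using Python's standard html.parser; the sample HTML is invented:

    from html.parser import HTMLParser

    class AltChecker(HTMLParser):
        """Print the source of any image with missing or empty alt text."""
        def handle_starttag(self, tag, attrs):
            if tag == "img":
                a = dict(attrs)
                if not (a.get("alt") or "").strip():
                    print("missing alt:", a.get("src") or "?")

    AltChecker().feed(
        '<img src="jump.jpg" alt="Skydiver in freefall"><img src="logo.png">'
    )
    # missing alt: logo.png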

8. Content
-- Your content needs to be fresh -- updating regularly and often is crucial for increasing traffic.

"The best sites for users, and consequently for search engines, are full of oft-updated, useful information about a given service, product, topic or discipline,"

9. Social media distribution
-- This is an important SEO strategy, according to SEO Consult. You should be distributing links to fresh content on your site across appropriate social networking platforms.

Whether displayed on your company's account, or recommended, re-tweeted, and re-distributed by someone else, this strategy multiplies the number of places where visitors will view your links.

10. Link to others
-- An easy way to direct more traffic to your site is by developing relationships with other sites.

PC World suggests that you personally ask the webmasters of well-respected sites if they'll include a link to your site on theirs. Be sure to return the favor -- then everyone wins!

Make certain that your partner has a good web-reputation, of course. MarketingVox warns against getting tied to a "link farm" whose bad SEO habits could bring you down.


Read more: http://www.businessinsider.com/10-basic-seo-tips-everyone-should-know-2010-1



Friday, March 9, 2012

Panda 3.3 Update


Here’s what Google says about its latest Panda-related change:

Panda update. This launch refreshes data in the Panda system, making it more accurate and more sensitive to recent changes on the web.
This sounds very similar to Panda 3.2, which happened in mid-January and was described only as a “data refresh” and not related to new or changed ranking signals.
Panda was launched a year ago — don’t miss the full background in our recent story, Infographic: The Google Panda Update, One Year Later.
Postscript, Feb. 28th: Google tells us that the Panda data update took place yesterday, February 27th. The company declined to share any additional information about the “link evaluation” item below.

Evaluating Links

Google says it’s getting rid of a link evaluation signal that it’s been using for years. This one’s sure to prompt discussion:
Link evaluation. We often use characteristics of links to help us figure out the topic of a linked page. We have changed the way in which we evaluate links; in particular, we are turning off a method of link analysis that we used for several years. We often rearchitect or turn off parts of our scoring in order to keep our system maintainable, clean and understandable.
We’ve reached out to Google in the past, asking for further clarification on the items in these monthly roundups. The company has indicated that the blog post says everything Google wants to say. That, along with Google’s understandable (and necessary) reluctance to give away too many details about ranking signals, leads me to assume we won’t be getting anything more than the above about this.
A link evaluation signal that’s been used for years is now turned off? The SEO mind races….

Local Search Rankings

Here’s another one, along with the link evaluation signal, that I’m actually surprised Google would so openly reveal. The company says traditional algorithmic ranking factors are now playing a bigger part in triggering local search results:
Improvements to ranking for local search results. [launch codename “Venice”] This improvement improves the triggering of Local Universal results by relying more on the ranking of our main search results as a signal.
Traditional SEO has played a bigger part in Google’s local search since the launch of Places Search in late 2010. And now it sounds like that dial is being turned up a little higher, too.
Google’s post also says local results are being improved because of a “new system to find results from a user’s city more reliably. Now we’re better able to detect when both queries and documents are local to the user.”

Other Google Updates

As I said, it’s impossible to recap the entire Google blog post. In addition to the items above, you might pay attention to these items:
  • More accurate detection of official pages
  • Expand the size of our images index in Universal Search
  • “Site:” query update
  • International launch of shopping rich snippets
There are also several updates related to freshness, sitelinks and related searches.