Search the Community

Showing results for tags 'seo'.

Found 49 results

  1. Google recently made a lot of updates to its search algorithm, but it disclosed only a few of them publicly; most were never announced. My own websites and my friends' sites had good traffic for years, and once we moved to HTTPS, it was all gone. I asked for help on the Google forum and from other SEO webmasters, but nothing helped: no HTML errors, no HTTPS redirect errors, nothing. Everything was perfect, and it still took four weeks to move all our HTTP links to HTTPS. Google's announcement that it gives a rank boost to HTTPS websites is, in my experience, a big lie. Who actually benefits from moving to HTTPS? 1. News websites. 2. E-commerce websites. 3. Websites that collect bank or card details. Note: moving to HTTPS is a risk right now, so do not try it yet; Google is still testing, and it may roll things back if HTTPS causes too many issues. Drawbacks: 1. You have to update all your backlinks to HTTPS. 2. You get no authority flow from your HTTP pages to your HTTPS pages. 3. Moving to HTTPS is like buying a new website and trying to build its traffic from scratch.
  2. If one wants to hire an SEO agency, which one do you recommend? It would not be appropriate to recommend a specific agency, because agencies change their policies and we cannot know what every agency is doing. The general answer: search for Google's SEO guidelines and you will find a Webmaster Help Center page explaining what to look for when choosing an SEO agency. References should tell you what a particular agency is actually going to do. If an agency acts as though it will wave some magic smoke and will not tell you what it is doing, be a little worried. Even if you find the details boring, they should let you know everything they are about to do. Google's SEO guidelines have shifted from being a little controversial to focusing more on how to find a good SEO agency. There are lots of great SEOs out there, so if one does not satisfy you, don't settle for less when you can have more.
  3. Is it true that domains registered before 2004 get PageRank in a totally different way, i.e. that 'pre-2004' domains are highly desirable because they earned PageRank under older, easier criteria? No, that is completely false. There is no difference between 2004 domains, 2005 domains, and 2006 domains; all domains earn reputation in the same way. There is literally no extra value in buying a pre-2004 domain, a pre-Google-IPO domain, or whatever you want to call it. There is absolutely no difference. Just make sure to get a domain that works well for you; there is no need to worry about whether it was created before 2004.
  4. Does the first link on a page matter a lot? Should I ensure that the first link is the one I care about most? If so, should we modify CSS or JavaScript to show the correct link first? Generally, it is better not to worry about it. If a page had a thousand links, I wouldn't make the important one the thousand-and-first, but there is no special advantage in it being the first link either. Google parses a page and tries to extract hundreds of links to find the relevant ones. The link should be somewhere the user can see it and click through, Googlebot should be able to follow it, and the page must be in good shape. So it is not recommended to bend over backwards with CSS or JavaScript tricks to make the most important link appear first; Google tries to find all of those links.
  5. 'Query deserves freshness.' Fact or fiction? Definitely a fact, not fiction. Amit Singhal talked about it in The New York Times, saying he believed there are some queries that deserve freshness. So Query Deserves Freshness (QDF) really is a fact, not fiction.
  7. Will one be penalized for listing every file in an XML Sitemap with the same priority? Definitely not. If one gives the same importance to every file in the XML Sitemap, Google will simply try to work out which ones are really important from its own perspective. So there is no need to worry about listing a priority for every single URL; Google is not going to apply any sort of scoring penalty. The field is completely optional. If one has a reason to assign priorities, that is great, but it is not a must-have, and you will not get into trouble without it.
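As a sketch of how optional the field is, here is a minimal Python sitemap builder (element names from the sitemaps.org schema; the URLs are placeholders) where `priority` can be applied uniformly or omitted entirely:

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls, priority=None):
    """Build a minimal XML sitemap. Per the answer above, `priority` is
    optional: omitting it, or giving every URL the same value, is fine."""
    root = ET.Element("urlset", xmlns=NS)
    for loc in urls:
        url_el = ET.SubElement(root, "url")
        ET.SubElement(url_el, "loc").text = loc
        if priority is not None:
            # Same priority for every URL -- no penalty for this.
            ET.SubElement(url_el, "priority").text = f"{priority:.1f}"
    return ET.tostring(root, encoding="unicode")

xml_out = build_sitemap(
    ["https://example.com/", "https://example.com/about"], priority=0.5
)
```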
  8. How does Google calculate the site load times it exposes in its webmaster statistics? Is the calculation simply the average time to request and receive the HTML content for a page? Yes, essentially. Googlebot sends out the request and, starting from that moment, Google measures the time it takes to see the response come back. So it is roughly the end-to-end time to deliver the page, or deliver the data, from the server. Google sees it from Googlebot's point of view; it has no way of measuring how long a page takes to load for any given user, so the numbers reflect only Googlebot's perspective.
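The end-to-end measurement described above can be sketched in Python. `timed_fetch` is a hypothetical helper (not Google's implementation) that times any fetch callable from sending the request until the full body is back:

```python
import time
import urllib.request

def timed_fetch(fetch):
    """Measure end-to-end time for a fetch callable: from issuing the
    request until the complete response is received -- roughly how the
    answer above describes Googlebot's measurement."""
    start = time.monotonic()
    body = fetch()
    elapsed = time.monotonic() - start
    return body, elapsed

# Example usage (network access assumed; URL is a placeholder):
# body, secs = timed_fetch(
#     lambda: urllib.request.urlopen("https://example.com").read()
# )
```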
  9. Do you have any specific tips for news sites, which have unique concerns compared with commercial sites? For instance, if it's a developing news story, it is recommended to have one page where all the PageRank can gather. By contrast, some sites run many stories on the same topic over several days without linking those stories together, and they lose readers through the cracks that way. Then consider Wikipedia, where one page gets richer and more developed over time. If a news story is over, you can think about moving on to a new page; but for a given ongoing story, it is better to add updates and more information on the same URL. Also take a look at the Google News documentation: there are some meta tags and other tags available there that are not available to other people, or that Google News treats specially. You can also give some thought to authorship markup, which helps in understanding exactly who wrote a particular story or page. If you run a news site, it is worth doing a little research along those lines.
  10. Can you explain the proposed autocomplete type attribute? Should we add this to web forms? Many websites have forms that ask for the name (first and last), the street address, the postal code, and so on. Visitors often feel too irritated or lazy to fill in these forms. If you are a business owner or a publisher, it is a lot better to make the forms easy to complete, so that visitors go on to make purchases, sign up for the newsletter, or whatever you are interested in. An easy way to do this: take the existing web form and use the standard Google Chrome has proposed, called autocomplete type, which lets the browser complete fields without the user typing them in full. It does not alter the form elements, i.e. the variable names stay the same; you are only adding annotations. By annotating the form fields with the type of value you expect people to fill in, Chrome knows how to complete the form with the browser's autocomplete data. When a Chrome user visits the page, wants to buy something, and starts typing, they see an autocomplete option: type in the first box and the rest is filled in automatically, which makes things simple. It should be semantically understandable in some sense. As a result, users fly right through the form to sign up, purchase, or whatever. It is highly recommended; it may take a few hours of your time, but it is really worth it.
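As an illustration only (the idea has since been standardized in HTML as the `autocomplete` attribute, with tokens such as `name`, `street-address`, and `postal-code`; the field names below are made up), a Python sketch that emits a form annotated this way without changing the field names themselves:

```python
# Hypothetical field list: (existing field name, autocomplete token).
FIELDS = [
    ("fullname", "name"),
    ("street", "street-address"),
    ("zip", "postal-code"),
    ("card", "cc-number"),
]

def annotated_form(fields):
    """Emit a form whose inputs carry autocomplete hints. The field
    names are unchanged; only the annotation is added, matching the
    'it's just only adding' point above."""
    rows = [
        f'<input name="{name}" autocomplete="{token}">'
        for name, token in fields
    ]
    return "<form>\n" + "\n".join(rows) + "\n</form>"

html = annotated_form(FIELDS)
```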
  11. Is it good to keep key content pages close to the root, or to have them deep within a topical funnel structure? The most notable point is that this is not SEO advice but behavioral advice. If your content is a small number of clicks from the root page, visitors will tend to find it. But if somebody has to click ten times to reach the page to register for admission, compared with registering right from the main root page, only a few people will find it when it is that many clicks away. So how deep the page sits in the path, whether at the root level or eight levels down, is not a big deal; maybe it matters for other search engines, but not for Google. The only thing that matters is thinking from the visitor's perspective. To be honest, this is not search-engine-ranking advice, just an opinion on improving ROI (return on investment).
  12. What are your views on PageRank sculpting? Useful and recommended, or unethical? It is not unethical: it is your site, and you are allowed to control how PageRank flows within it. But to be honest, it is not the first thing to rely on; instead, focus on getting more links and having higher-quality content. Still, given a certain amount of PageRank, one can shape where it goes. It does not have to be done with nofollow, though one can put a nofollow on a login page or something similarly specialized where a robot will never log in. A more effective form of PageRank sculpting is choosing which things to link from your homepage. For instance, if you have two product pages, one that earns you a lot every time someone buys and another that earns less, you should highlight the first page and make sure it gets enough PageRank by linking to it from the homepage. PageRank sculpting is not only about nofollow; it is also about how you choose to structure the site and link between its pages. It is not unethical to take the links coming into your site and then decide how to link within it and how to build its pages. Still, the better way to rank well is to have good content that earns more links; PageRank sculpting can help, but it is not the first thing to work on.
  13. Does Google crawl and treat tiny URLs using a 301 redirect the same as other links? Whenever Google indexes one of these 301 redirects, it follows it and flows PageRank just as it normally would with a 301 from any other site. Danny Sullivan did a great piece on URL-shortening services, in which he took the top shortening services and examined whether each does a 301 redirect or some other sort of redirect. When a shortening service does a 301, Google should flow all the PageRank just as it does with any other sort of 301 redirect. With shortening services that do it the right way, Google can follow the redirect and find the destination URL without any trouble.
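A minimal Python check in the spirit of Danny Sullivan's test, assuming network access and using hypothetical URLs: fetch a short link without following redirects and see what status it returns (301, and the later 308, are the permanent redirects that should pass PageRank):

```python
import http.client
from urllib.parse import urlsplit

def is_permanent(status):
    """301 and 308 are permanent redirects -- per the answer above,
    the kind through which PageRank should flow to the destination."""
    return status in (301, 308)

def first_redirect(url):
    """Fetch `url` WITHOUT following redirects and return
    (status, Location header). Network access assumed; note that some
    services may answer HEAD differently from GET."""
    parts = urlsplit(url)
    conn_cls = (http.client.HTTPSConnection
                if parts.scheme == "https" else http.client.HTTPConnection)
    conn = conn_cls(parts.netloc, timeout=10)
    conn.request("HEAD", parts.path or "/")
    resp = conn.getresponse()
    return resp.status, resp.getheader("Location")

# Example usage (placeholder short link):
# status, target = first_redirect("https://example-shortener.test/abc")
# print(status, target, is_permanent(status))
```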
  14. Does the position of keywords in the URL have a significant impact? It is better not to worry about it much. Yes, having keywords in the URL helps to some degree, but stuffing lots of keywords into it does not. Four or five keywords looks clean and readable; beyond that, worrying about how deep the URL sits in the path or how to combine the words is not worthwhile. For instance, for a blog post it is sensible to take two to five important words related to the post and use them in the URL. But cramming eight, ten, or twenty words into the URL looks spammy and weird, and people are less likely to click it. So keyword position in URLs is a second-order factor; rather than worrying about it, focus on having great content that people want to link to and want to discover.
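The "two to five important words" guidance above can be sketched as a slug builder in Python; the stop-word list is a hypothetical minimal one:

```python
import re

def slug(title, max_words=5):
    """Build a URL slug from the first few meaningful words of a title,
    following the 2-to-5-keyword guidance above. The stop-word set is
    an illustrative minimum, not an exhaustive list."""
    stop = {"a", "an", "the", "of", "and", "to", "in", "for"}
    words = [w for w in re.findall(r"[a-z0-9]+", title.lower())
             if w not in stop]
    return "-".join(words[:max_words])
```

Usage: `slug("The Best Way to Include a Company Logo for SEO")` yields a short hyphenated slug instead of a twenty-word one.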
  15. Will there be any issues if the h1 tag appears below the h2 tag in the code? Does the spider still know what's going on? One need not worry about it; Google handles h1s and h2s just fine. Just don't make the whole page an h1 or h2. People put all sorts of things on the web: according to one study from many years ago, forty percent of web pages had syntax errors. So having one h1 below an h2 is not a big deal. Plenty of people publish broken pages, ugly pages, and pages that are not really HTML, and Google still tries to process them, because those pages may contain good information. So don't worry about having some out-of-order h1s or h2s.
  16. Under what conditions will Google display "Did you mean" results above the normal results? This behavior began in November 2008. Google shows it when it is more confident that the suggestion is genuinely helpful. Not all users click on "Did you mean" results: some are not savvy, and some never notice it at all, a kind of blindness that keeps them from clicking a result that would really help them. Google shows the more prominent interface when it thinks there is a high probability it will help users, for example when the query is badly misspelled or much better results await behind the corrected search. It will not be 100% perfect every time; if Google returns bad results, one can always use the plus sign, or put a phrase or keyword in double quotes, to force the exact search one wants. Google keeps learning and improving the individual algorithms, and has seen a real quality improvement for the majority of users.
  17. What impact do site load times have on Google rankings? There is a short answer and a long answer. The short answer: at present, load time has no impact, with one exception. If a site takes so long to load that Google cannot even fetch it, so Googlebot cannot get a copy, that will affect your rankings, because the site is essentially timing out; if your site takes 20 or 30 seconds to respond to requests, that is a real problem. But one second versus two seconds makes no difference to Google's rankings. The longer answer: Larry Page has said the web should be as fast as a magazine, where you turn the page the moment you are ready for the next one, and Chrome was built to make the web fast, with good experience and performance. So for now, site load times have no impact on Google rankings, but who knows what the future might bring. If I had to guess, Google wants the web to be much faster and will think about encouraging people to speed up their sites. If your site is faster, visitors are happier and will visit or use it more often. So it is worth noting that Google wants the web to be fast and wants sites to load quickly.
  18. Are product description pages on an e-commerce site treated as duplicate content if the same description appears on other sites? This commonly happens with branded products. Yes, it happens, mostly because the content is not original. When one takes an affiliate feed, which may or may not include images, one ends up with the same content on one's e-commerce product page as four hundred other websites, with no differentiation and no way to stand out. That is where your value-add must come in. Ask yourself: "What does my affiliate site, or my website without original content, add compared to these hundreds of other sites?" It is really important to have original content and a genuinely unique value-add. It is not recommended to just take an affiliate feed and spin up a site in one swoosh, because you give people no reason to come to your site. The best approach is to find a unique angle and make sure you do not end up with exactly the same content as everyone else.
  19. Do dates in the URL of blogs or websites help in determining the freshness of the content? Dates in the URL or in the content are genuinely helpful to readers, but people could game them and claim a page is always ten minutes old, so Google has its own ways of working out how fresh pages are: for instance, the first time the crawler saw a page, and how much the content changes between revisits. It is still better to have a clear, dated URL, because visitors can then figure out how old the content is. But one need not do it for Googlebot's sake; it is a usability feature, and Google has its own ways of determining how fresh the content is. There is no need to put the date in the URL or the content just to convince Google of freshness, as Google already does that computation for itself.
  20. What is the best way to include the text of a company logo for SEO purposes: an ALT tag, or CSS hiding? This choice deserves some consideration, because it does matter. It is recommended to use the ALT attribute (commonly called the ALT tag) rather than CSS hiding, because this is more or less exactly what ALT was built for: it lets you say, "This is the text in my logo," and search engines can read and use it. There is no need to hide the text with CSS or anything like that; the ALT attribute is not only valid and simple but also works very well.
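A small Python sketch, using the standard-library HTML parser and a made-up logo markup, showing that alt text is directly machine-readable, which is why it is the right place for a logo's text:

```python
from html.parser import HTMLParser

class AltCollector(HTMLParser):
    """Collect the alt text of <img> tags -- the text a crawler can
    read for a logo image, as the answer above recommends."""
    def __init__(self):
        super().__init__()
        self.alts = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            alt = dict(attrs).get("alt")
            if alt:
                self.alts.append(alt)

parser = AltCollector()
# Hypothetical logo markup:
parser.feed('<a href="/"><img src="logo.png" alt="Acme Widgets"></a>')
```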
  21. Underscores or dashes in URLs: is there a difference between my-page and mi_pagina? Yes, they are treated differently. If one can choose between underscores and hyphens, I would personally go for hyphens (-). But if you already use underscores (_) and everything works well, there is no need to worry about it or to change your architecture. A team inside Google has looked at treating underscores as separators too, a small change that might have a bigger impact, but there is no confirmation that it will happen. As of now, hyphens (-) are treated as word separators and underscores (_) are not. That might change in the future, but not yet.
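The separator behavior can be illustrated with Python's regex class `\w`, which, like the historical treatment described above, counts the underscore as a word character rather than a separator (an analogy only, not Google's actual tokenizer):

```python
import re

def tokens(path):
    """Split a URL path segment into word tokens. `\\w` matches letters,
    digits, AND the underscore, so "mi_pagina" stays one token while
    "my-page" splits into two -- mirroring why hyphens act as word
    separators and underscores do not."""
    return re.findall(r"\w+", path)
```

So `tokens("my-page")` gives two searchable words, while `tokens("mi_pagina")` gives one fused token.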
  22. In some queries, Google shows the date of the post in the description snippet in search results. Is there a reason for this? Is there any way to tell Google not to show it? Google's snippets team always tries its best to show genuinely helpful descriptions, or snippets, in the search results. On a forum, maybe there have been four replies; on a blog, maybe there have been 40 comments on that post. The snippets team keeps thinking about new ways to make snippets useful, and highlighting the date the blog post appeared is one of them: if something is recent, that fact can be helpful in its own right. Google reserves the right to show whichever snippet it thinks is best for users, including which part of the page to quote and whether to mention the date the post went live. It does these things because it wants the best for its users, and at present there is no way to tell Google not to show the date of the post in snippets.
  23. Google has become more proactive in showing results for "corrected" spellings. How will Google employ smart guesses in search results in the future? Google serves a huge number of visitors every day; some users are not savvy, and some do not always spell correctly. Looking at random queries, perhaps 10% of them are misspelled, so Google set out to write one of the world's best spellcheckers. The point is, there was a huge click-through rate on "Did you mean", yet some people never knew it was there. Because of that, Google recently introduced a change: it shows a few results for what it thinks is the correct spelling, and then the normal results below. When a user does not know how to spell a word correctly, this helps a lot. It also helps against web spam, because spammers target typos and misspellings; regular users now stumble on the good results instead of the spam. If you are a power user, and there are a ton of people who are, you can always put a plus before a word to say, "This is the exact word I meant to search for," or put it in double quotes, even a single word. There are several ways to tell Google you meant exactly that word. Google also tries to be smart: if someone types something that looks misspelled but is not, Google figures that out over time. Google always tries to come up with something reasonable that works for the vast majority of users, and keeps looking for ways to improve the next generation of algorithms and the next data push for the spell-correction code.
  24. If one has a lot of blog content for a new section of a site (100+ pages), is it best to release it over a period of time, or is it fine to just unleash all 100 pages? Generally, one can release a hundred pages at once if the content is high quality. But at the scale of ten thousand, a hundred thousand, or a million pages, one should be more careful; not because it triggers any automatic penalty, but because it can look like junk. One day there is no content on the site, and the next day there are suddenly two million pages in the index, so people may start to wonder whether it is legitimate content or auto-generated junk with no added value. Mostly, when you create content organically, you end up with a page at a time, so it is recommended to publish each page as you have it rather than waiting until you have gathered a lot of content and releasing it all together. Releasing a page at a time is a perfectly good approach, and there is nothing to worry about with a small batch of high-quality material.
  25. What are the factors determining the PageRank of a Twitter page: the followers, or backlinks? Remember that Google treats Twitter pages as regular web pages. Google does not consider the number of followers on Twitter; it considers only the number of links pointing to your profile or to specific tweets. So a specific interesting status message that attracts a lot of PageRank, a lot of links, is more likely to show up in Google search results. Google ignores follower counts because it knows followers can be gamed very easily on a lot of different sites; it always bases reputation on backlinks and the reputation of those backlinks. That is how it figures out how reputable a page is, on Twitter just as on any other site across the web.