Showing results for tag 'google search'. Found 37 results.

  1. Will SEO still exist in 5 years? Yes, Search Engine Optimisation will still exist in 5 years. Google tries to build search so that you do not need to be an SEO expert, but SEO genuinely helps put your website's best foot forward. The notable fact is that SEO is not spam: there are plenty of white hat techniques you can use. Canonicalization, for example, helps ensure that all the backlinks a URL deserves are consolidated, so you do not end up with a lot of duplicate content. There is a great deal a search engine optimizer can do to make sure a site is well represented and makes a good impression, which is genuinely beneficial for visitors. SEO should not slide into black hat techniques or illegal tricks, as that can sink a website and make the whole effort unproductive. People should learn about white hat SEO, but they do not need to become experts; the webmaster console (or other search engines' consoles) is a good place to start. Google even publishes an SEO starter guide, which it would not do if it thought SEO had no value. So it's a big YES: SEO will definitely be around in 5 years, and that's a good thing.
  2. If one looks to hire an SEO agency, which one do you recommend? It would not be appropriate to recommend a particular SEO agency personally, because agencies change their policies and it is hard to know what any given agency is doing at the moment. The general answer: search for Google's SEO guidelines and you will find a page in the Webmaster Help Center that explains what to look for when choosing an SEO agency. References should tell you what that particular agency is going to do. If they act as if they are going to wave some magic smoke and will not tell you what they are doing, be a little worried. Even if the details sound boring, they should let you know everything they are about to do. Google has revised its SEO guidelines from a somewhat controversial tone to one more focused on how to find a good SEO agency. There are lots of great SEOs out there, so if one does not satisfy you, don't settle for less when you can have more.
  3. Is it true that domains registered before 2004 have a totally different way of getting PageRank? For example, are 'pre-2004' domains highly desirable because they earn PageRank under older, easier criteria? No, that is completely false. There is no difference between 2004 domains, 2005 domains, and 2006 domains; all domains earn reputation in the same way. There is literally no extra value in buying a pre-2004 domain, a pre-Google-IPO domain, or whatever you want to call it. There is absolutely no difference. Just make sure to get a domain that works well for you, without worrying about whether it was created before 2004.
  4. Does the first link on a page matter a lot? Should I ensure that the first link is the one I care about most? If so, should we modify CSS or JavaScript to show the most important link first? Generally, it is better not to worry about it. If a page has a thousand links, I wouldn't make the important one the thousand-and-first, but there is no special advantage in having it be the very first link. Google parses the page and tries to extract the hundreds of links on it to find the relevant ones. As long as the link is somewhere the user can see and click, and Googlebot can follow it, you are in good shape. It is not worth bending over backwards with CSS or JavaScript tricks just to make the most important link appear first; Google tries to find all of those links anyway.
  5. Will Google find text in images someday? It is easy to say in words, but it is a big undertaking in practice, though a fun one. It would be great if Google crawled the web, found all the images, and ran OCR (optical character recognition) on all of them. But honestly, that is a very ambitious dream involving an enormous amount of work. The notable point is that one should not expect this from Google any time soon.
  6. Will one be penalized for having every file in an XML Sitemap listed with the same priority? Definitely not. If you give the same priority to every file in your XML Sitemap, Google will simply try to work out which ones are really important from its own perspective. You do not need to list a priority for every single URL, and Google is not going to apply any sort of scoring penalty over it. The field is completely optional. If you have a good reason to assign priorities, that is great, but it is not a must-have, and leaving it out will not get you into trouble.
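As a sketch (with hypothetical URLs), a sitemap's priority element is optional, and identical values across entries are perfectly valid:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- priority is optional; the same value on every URL is fine -->
  <url>
    <loc>https://www.example.com/</loc>
    <priority>0.5</priority>
  </url>
  <url>
    <loc>https://www.example.com/about</loc>
    <priority>0.5</priority>
  </url>
</urlset>
```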
  7. How does Google calculate the site load times it exposes in Google's webmaster statistics? Is the calculation simply the average time to request and receive the HTML content for a page? Essentially, yes. Googlebot sends out the request, and Google measures the time from that point until it sees the response come back, so it is roughly the end-to-end time to deliver the page's data from the server. Google sees this only from Googlebot's perspective; it has no way to measure how long it takes to deliver a page to any given user.
  8. Will there be any issues if the h1 tag appears below the h2 tag in the code? Does the spider still know what's going on? You need not worry about it, as Google handles h1s and h2s quite well; just don't make the whole page an h1 or h2. People put all sorts of things on the web: according to one study from many years ago, forty percent of web pages had syntax errors. So one h1 below an h2 is not a big deal. Plenty of people publish broken pages, ugly pages, and pages that are not really HTML, and Google still tries to process them, because those pages may contain good information. So don't worry about having some out-of-order h1s or h2s.
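For illustration, a page like the following hypothetical markup, with an h2 appearing before the h1, is still parsed without trouble:

```html
<!-- heading order is not strictly enforced by search engines -->
<body>
  <h2>Latest updates</h2>
  <p>Teaser text for recent posts.</p>
  <h1>Main page title</h1>
  <p>Main content of the page.</p>
</body>
```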
  9. Are product description pages on an e-commerce site treated as duplicate content if the same description appears on other sites? This happens with many branded products, and yes, it is usually treated that way, mostly because it is not original content. When you take an affiliate feed, with or without images, your e-commerce product page carries the same content as four hundred other websites. There is no differentiation and no way to distinguish yourself from the rest. This is where you need your value add. Ask yourself: "What does my affiliate site, or my website that lacks original content, add compared to these hundreds of other sites?" It is really important to have original content and a genuinely unique value add. It is not a good idea to simply take an affiliate feed and throw up a site in one swoop, because you give people no reason to seek out your site. The best approach is to find a unique angle and make sure you don't end up with exactly the same stuff that is on other sites.
  10. Do dates in the URL of blogs or websites help in determining the freshness of the content? Dates in the URL or in the content are helpful to users, but people could game them and claim a page is always ten minutes old, so Google has its own ways of determining how fresh pages are: for instance, the first time the crawler saw a page, and how much the content changes between revisits. It is still better to have a clear URL, because visitors can then figure out how old the content is. But you need not do it for Googlebot's sake; it is a usability thing, and Google already does its own computation to determine freshness. You really don't need the date in the URL or in the content just to convince Google the page is fresh.
  11. What is the best way to include the text of a company logo for SEO purposes: the ALT attribute or CSS hiding? Give this a little consideration, because it matters. It is recommended to use the ALT attribute (commonly called the ALT tag) rather than CSS hiding; this is more or less what the ALT attribute was built for. Go ahead and use ALT to say, "This is the text in my logo." Search engines can read that and use it. There is no need to hide text with CSS or anything like that, because the ALT attribute is not only valid and simple but also works very well.
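A minimal example, assuming a hypothetical logo file and company name, of carrying the logo text in the alt attribute instead of hiding it with CSS:

```html
<!-- preferred: the alt attribute carries the logo text -->
<img src="/images/logo.png" alt="Example Company">

<!-- avoid: hiding real text off-screen purely for search engines -->
<!-- <span style="text-indent: -9999px">Example Company</span> -->
```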
  12. Google has been more proactive in providing results that feature "corrected" spellings. How will Google employ smart guesses in search results in the future? Google has a huge number of visitors every day; some users are not savvy and some do not always spell correctly. If you look at random queries, 10% of them might be misspelled, so Google decided to write one of the world's best spellcheckers. But even with a large click-through rate on "Did you mean", some people did not know it was there. Because of this, Google recently introduced a change: for some queries it shows a few results for what it believes is the correct spelling, and then the normal results below. When a user does not know how to spell a word correctly, this helps a lot. It also helps against web spam: spammers target typos and misspellings, and this way regular users stumble on the good results rather than the spam. If you are a power user (and there are a ton of them), you can always put a plus before a word to say, "This is the exact word I meant to search for," or put the word in double quotes, even a single word. There are several ways to tell Google exactly which word you meant. Google tries to be smart: if someone types something that looks misspelled but actually is not, Google figures that out over time. The goal is always something reasonable that works for the vast majority of users, and Google keeps looking for ways to improve the next generation of algorithms and the next data push for the spell-correcting code.
  13. If one has a lot of blog content for a new section of a site (100+ pages), is it best to release it over a period of time or is it fine to just unleash 100 pages? Generally, you can release those hundred pages at once if the content is high quality. But at the scale of ten thousand, a hundred thousand, or a million pages, you should be more careful; not because it triggers any kind of automatic penalty, but because it can look like junk. One day there is no content on the site, and the next day there are suddenly two million pages in the index, so people may begin to wonder whether it is legitimate content or auto-generated junk with no value added. When you create content organically, you usually end up with a page at a time, so it is recommended to publish content as you have it rather than waiting until you have gathered a lot of it to release all together. Releasing a page at a time is a good habit, and for small-scale, high-quality content there is nothing to worry about.
  14. What are the factors in determining the PageRank of a Twitter page: the followers or the backlinks? Remember that Google treats Twitter pages as regular web pages. Google does not consider the number of followers on Twitter; it only considers the number of links pointing to your profile or to specific tweets. So a specific, interesting status message with a lot of PageRank, that is, a lot of links pointing to it, is more likely to show up in Google search results. Google ignores follower counts because it knows followers can be faked very easily on a lot of different sites, so it relies on backlinks and the reputation of those backlinks. That is how it figures out how reputable a page is, on Twitter just like on any other site across the web.
  15. Is it good to put a 'coming soon' page on new domains? Yes, it's a pretty smart thing to do. It is good for visitors, because at least they don't end up on a "black hole" page. If you have content that is about to come out, there is nothing wrong with a "coming soon" page; as you get more content, you put it out there, and when the full site is ready, you launch the full site. There is nothing to worry about ranking-wise; it's not a big deal. It's a good option for users, and it can be a good thing for search engines as well.
  16. Why does Google search not treat the @ sign differently, given the rise of Twitter? For example, @google and google give me the same results. This is a deliberate choice: Google did not want to index email addresses, at least, since nobody wants people scraping Google to harvest a whole lot of email addresses. So it was a thoughtful decision not to index the @ sign. Maybe over time Google will start to treat it differently, but for now it has not been a well-known request, or one heard often enough, for Google to put resources into it at present.
  17. What is the power of keyword domains, and what kind of domain should one go for? If you are registering a new domain name to compete in a particular niche, there are two different strategies. You can opt for a brandable name such as Twitter or Tumblr; people remember these kinds of names without keywords being in the domain. Or you can put the keywords in the domain name, for instance dressshoponline.org or dressshoponline.net. There are reasonable arguments on both sides: a keyword-laden domain versus a domain that does not carry the keywords but is more brandable. You can certainly succeed without keywords in the domain. Zynga, for example, has nothing about socialising or gaming in its name, and Twitter, Facebook, Google, and Yahoo are instantly memorable brandable names, not keyword-laden domains. Keywords in the domain do give one benefit: when people refer to the business by name and link to it, they link with the same words that are in the keyword. Both options can work, so choose based on your goals and interests. The brandable choice is often better; for instance, if you run 15 sites about Android and they all have the word 'Android' in keyword-laden domains, they will seem strange and a little hard to remember, whereas a brandable name is easy to remember, which helps people come back to the site. Reddit, for example, has nothing about news in its name, yet it carries interesting social news. If you are aiming for a big success, choosing a somewhat more brandable name is a good first move. If you are still unsure, take a look at the rankings and the weight Google gives to keyword domains. Google gets complaints about giving a little too much weight to keywords in domains, so it is going to adjust that mix and turn it down a bit within the algorithm; a domain with a lot of keywords in it will not necessarily help you.
  18. Does Google consider SEO to be spam? No. SEO stands for Search Engine Optimization, and it is about making sure your pages are well represented within search engines. There is a lot a search engine optimizer can do: make sure your pages are crawlable and accessible, so people can find them just by clicking on links; make sure you use the right keywords, since if you use industry jargon or lingo that is not well known, a good SEO can help you find the keywords people actually search for; think about usability and good site design; and think about making the site faster, since Google uses site speed as one of the factors in determining search rankings, and a faster site gives a better user experience. SEO also covers site architecture, URL structure, templates, and similar things that maximize the return on investment. There is nothing wrong with these white hat methods. There are also SEOs who employ black hat techniques: people who hack sites, keyword stuff and repeat things, or do sneaky things with redirects. Google's aim is to return the best possible search results, and SEOs can help by cooperating and making pages easier for search engines to find. SEO is highly useful, but it can also be misused and overdone. Search engines are not yet as smart as people; Google is working on figuring out what people actually mean, using synonyms, vocabulary, and stemming, so that you need not worry about picking exactly the right word to find what you want. Until then, SEOs can help people find what they are looking for via search engines. So no, Google does not consider SEO to be spam.
  19. Would you spend time with a perfectly optimized website or a barrel full of kittens? Well, in my case, I would definitely go for the second option without a second thought. Remember that kittens have a sort of emotional appeal: they tug at your heart strings. If you add some emotional appeal to your site, it becomes fascinating to visitors; it should not just be perfectly optimized in terms of h1s. Users respond to, and are more likely to click on, something like Cute Overload: adding cute, fun touches to the web design, say pictures of kittens. The trick is figuring out the right niche to optimize for. In simple terms, the website should not be a completely rational, cold-hearted, purely functional thing; it should have some fun in it. For instance, a ball of string on your 404 page, or something with quirky good humour. Visitors will love it, and they will be more likely to link to it and share it with their friends.
  20. Google has said that frequently updated pages get boosted in rankings (QDF), which favours blogs and news sites over company sites that have less reason to update often. How important a signal is 'freshness'? First, understand that updating a page or blog often does not by itself mean it should rank higher, so do not over-interpret freshness. Recall Andrei Broder's taxonomy of the searches people do: navigational, informational, or transactional. Navigational might be looking for the HP or IBM home page; informational might be how to do something, like setting the default printer in Firefox; transactional might be shopping for products to buy. It is a very broad hierarchy, and if you dig down a bit you can see that some queries seek something fresh: if someone searches for an event that just happened, that is QDF, "query deserves freshness". But not every query deserves freshness; navigational and evergreen queries do not, and when people search for long-form content and do deeper research, the freshness factor matters much less. Google uses over 200 signals. Do not fall into the trap of thinking you must have fresh content, and therefore shuffling a few words on your pages and updating the byline date so they seem fresh; that does not lead to higher rankings. If you are not in a news area or a topic that needs fresh content, you need not worry about it. Evergreen content lasts and stands the test of time, and it is better to work on such articles than to keep jumping on the latest story. If you write about hot breaking gadget news, by all means stay fresh and keep your content relevant, but do not rewrite pages or change words on the page just to seem fresh.
  21. With canonical tags, can you point a page to itself? Say, for instance, www.google.com/webpage points to www.google.com/webpage: will this cause a loop? The rel=canonical element, often called the "canonical link", is an HTML element that helps webmasters prevent duplicate content issues by specifying the "canonical", or preferred, version of a web page, which improves a site's SEO. A rel=canonical tag that loops back to the same page does not cause a problem in Google: when Google wrote the code supporting the rel=canonical attribute, they built in safeguards to ensure a self-referential tag causes no trouble, and other search engines commonly handle it the same way. Consider the alternative: you would have to check every single URL, do a self-check to see whether you were on that URL, and suppress the tag when you were, which complicates generating the tags. If you want a rel=canonical tag on every single page, there is no problem with that, and even if it loops back to itself, it is not a big deal at all. Google is fine with it.
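For instance, a page at a hypothetical URL can safely declare itself as the canonical version:

```html
<!-- on https://www.example.com/webpage: a self-referential
     canonical link is harmless and commonly used -->
<head>
  <link rel="canonical" href="https://www.example.com/webpage">
</head>
```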
  22. What is Google's view on guest blogging for links? High-quality guest blogging is worthwhile in some cases; the question is what happens when it gets extreme or someone takes it too far. Take an easy case first. Suppose a high-quality writer, say Lisa Barone, wants to do a guest post on some other blog; they would be happy to have her. Likewise, if people like Vanessa Fox or Danny Sullivan write something on a different blog, you should be glad to have them write an article for you, as they bring a lot of insight and knowledge. It is a win for the person hosting the post, and it is great for a less well-known writer who writes really well to become a little better known. But it can also be taken to extremes: people offering the same blog post many times, or spinning a blog post and offering it to several outlets, so it becomes a low-quality article. Sometimes the "author" even outsources the writing to a non-expert and then inserts hyperlinks to get them into the post. Having high-quality bloggers jump back and forth or collaborate in different ways is a long and time-honored tradition, but be careful, because practices that make a lot of sense with high-quality people can be scaled into mass link generation, and links gained that way are less likely to be counted by Google. The kind of links Google will count come from higher-quality articles, where someone with their own point of view thought hard about the message they wanted to convey and put real effort and originality into it. That gives a bit of a feel for the space: high-quality guest posts carry real value, while low-quality mass-produced ones might not be worth the time.
  23. If someone buys a text link from my website, I add the nofollow tag so it is marked as a paid link. Should one do the same for banner images too? Generally speaking, most banner ads are sold through ad exchanges or advertising networks, which block out bots; they do this because they do not want bots crawling their banner ads and messing with their impression or click counts. If you are using any standard banner-ad package, the redirects will most likely go through URLs that are blocked by robots.txt or that are otherwise not crawlable by search engines, so you need not worry about it. But if you sell a link directly, and the only difference is that one is an image and the other a text link, then if it was paid for specifically, I would put a nofollow on the link, even an image link. Google mostly handles typical banner ads and the like correctly, so they do not flow PageRank, and things are handled accordingly; in that case too, there is nothing to worry about.
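A sketch of how a directly sold link, whether text or image, might carry rel="nofollow" (URLs and filenames hypothetical):

```html
<!-- paid text link -->
<a href="https://advertiser.example.com/" rel="nofollow">Advertiser</a>

<!-- paid banner: the image link gets the same treatment -->
<a href="https://advertiser.example.com/" rel="nofollow">
  <img src="/ads/banner.png" alt="Advertiser banner">
</a>
```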
  24. A search for a physical product ranks Amazon number one, in spite of it not offering the best user experience. What has Google done to prevent large corporations from dominating search engine results? First, accept that Amazon has a relatively good user experience, though it is not true that Amazon always ranks number one for every physical product. For instance, when you search for a book, Amazon is up there, but if there is an official homepage for the book, it ranks well and may be number one. The trouble is that not every book has a homepage, which still surprises me: an author will have their own web page but no dedicated page for a particular book. It is just a lack of savviness. When I searched for Mrs. Byrne's Dictionary of Unusual, Obscure and Preposterous Words, there was no content about it on the web except on Amazon, Goodreads, and Google eBooks. The best answer is to make sure there is an actual page for the product. Google tries to identify official homepages, whether for governments, universities, states, or whatever, and tries to return those when possible. Google pays attention when users search and complain: if users complain about not finding the actual homepage for a product, Google takes that into account. In general, Google looks at the number of links and the content of the page, and if a specific web page earns a lot of links because visitors think it is a great site, it will tend to rank well.
  25. How can you quote correctly from different sources without getting penalized for duplicate content? Is it possible to quote and refer to the source? Consider two cases. In the first, you are a normal blogger and you want to quote an excerpt from an author you like or from another blog with good insight. Put the excerpt in a block quote and add a link to the original source, and you need not worry; Google can recognize this kind of quoting without treating it as an issue. In the second case, you quote an entire article from another website, or many articles from many websites, without even attempting to add anything of your own; that will affect how Google views your site's reputation. If you are a normal blogger adding a quote from one site and adding some value of your own, with attribution, insight, research, or commentary, rather than a bare quote with nothing else, it is not a big deal. A blog post with a quoted paragraph, a link to the original source, and your own take on why you agree or disagree with that paragraph is fine. Techdirt, for example, is a site that includes small quotes but gives its own perspective, so it can be considered unique. Such quoting is legitimate, and Google is fine with it.
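A minimal sketch of quoting with attribution (the source URL is hypothetical):

```html
<blockquote cite="https://original.example.com/post">
  <p>The quoted excerpt goes here.</p>
</blockquote>
<p>Via <a href="https://original.example.com/post">the original post</a>;
   here is my own take on why I disagree with this point.</p>
```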