Search the Community

Showing results for tags 'search engine'.



Found 26 results

  1. If one looks to hire an SEO agency, which one do you recommend? It would not be wise to recommend a specific agency, because agencies change their policies and we cannot know what each one is currently doing. The general answer: search for Google's SEO guidelines and you will find a Webmaster Help Center page explaining what to look for when choosing an SEO agency. References should tell you exactly what a particular agency is going to do. If they act as though they will wave some magic smoke and will not tell you what they are doing, be a little worried. Even if you find the details boring, they should let you know everything they are about to do. Google's SEO guidelines have shifted from being somewhat controversial to focusing on how to find a good SEO agency. There are lots of great SEOs out there, so if an SEO does not satisfy you, don't settle for less when you can have more.
  2. 'Query deserves freshness.' Fact or fiction? It is definitely a fact, not fiction. Amit Singhal discussed it in The New York Times, saying he believed there are some queries that deserve freshness. So Query Deserves Freshness (QDF) really is fact, not fiction.
  4. Will Google find text in images someday? It is easy to say in words but a big undertaking in practice, though it would be fun at the same time. It would be great if Google crawled the web, found all the images, and ran OCR (optical character recognition) on every image on the web. But to be honest, that is a lofty goal involving an enormous amount of work, so one should not expect this from Google in the short term.
  5. Do you have any specific tips for news sites, which have unique concerns compared with commercial sites? For instance, with a developing news story, it is recommended to keep one page where all the PageRank can gather. By contrast, some sites run many stories on the same event over several days and never link those stories together, which makes it more likely to lose people through the cracks. Wikipedia takes the one-page approach: a single page that gets richer and more developed over time. If a news story is truly over, you can think about moving to a new page, but for a given ongoing story it is better to add updates and more information at the same URL. Also take a look at the Google News documentation: there are meta tags and other tags available there that are not available elsewhere, or that Google News treats specially. Consider authorship markup too, which helps establish exactly who wrote a particular story or page. If you run a news site, it is worth doing a little research along those lines.
  6. Can you explain the proposed autocomplete type attribute? Should we add this on web forms? Many websites have forms asking for first and last name, street address, postal code, and so on. Visitors often feel irritated or too lazy to fill these forms in. If you are a business owner or publisher, it is much better to make those forms easy to complete, so visitors actually finish the purchase or sign up for the newsletter or whatever you are interested in. An easy way to do this: take your existing web form and use the standard Google Chrome has proposed, called autocomplete type, which lets the browser complete fields without the user typing them in full. It does not alter the form elements, i.e. the variables stay the same; you are only adding annotations. By annotating the form fields with what you expect people to fill in, Chrome knows how to complete the form, so when a Chrome user visits the page, wants to buy something, and types in the first box, the rest can be filled in automatically via autocomplete. It makes the form semantically understandable in some sense, and users fly right through it: they can sign up for the newsletter, purchase, or whatever. It may take a few hours of your time, but it is highly recommended and really worth it.
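A minimal sketch of such an annotated checkout form. Note: the attribute that was eventually standardized in HTML is autocomplete with field-name tokens (the original Chrome proposal used a separate autocompletetype attribute); the field names and action URL below are placeholders, not from the original answer.

```html
<!-- Existing form fields keep their names; only autocomplete hints are added. -->
<form action="/checkout" method="post">
  <input type="text"  name="fname" autocomplete="given-name">
  <input type="text"  name="lname" autocomplete="family-name">
  <input type="text"  name="addr"  autocomplete="street-address">
  <input type="text"  name="zip"   autocomplete="postal-code">
  <input type="email" name="email" autocomplete="email">
  <input type="submit" value="Buy">
</form>
```

Because the annotations are additive, the form submits exactly the same variables as before; browsers that do not recognize the tokens simply ignore them.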
  7. Does Google crawl and treat tiny URLs using a 301 redirect the same as other links? TinyURL is just one of many URL-shortening services, and the answer depends on how a given service redirects. Whenever Google indexes one of these 301 redirects, they follow it and flow the PageRank just as they would with a 301 from any other site. Danny Sullivan did a great piece on URL-shortening services: he took the top services, such as tinyurl.com and bit.ly, and checked whether each does a 301 redirect or some other sort of redirect. When a shortening service does a 301 redirect, Google flows all the PageRank just as with any other 301, and Google can follow it to the destination URL without any trouble, as long as the shortening service does it the right way.
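The follow-the-301 behaviour described above can be sketched with Python's standard library. The toy shortener below is hypothetical (the paths and response body are made up); it only illustrates that an HTTP client, like a crawler, ends up at the destination URL after a permanent redirect.

```python
# Toy URL shortener that answers /abc123 with a 301 to the real page.
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class ShortenerHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/abc123":                 # the "short" URL
            self.send_response(301)                # permanent redirect
            self.send_header("Location", "/long/destination-page")
            self.end_headers()
        else:                                      # the destination URL
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(b"destination content")

    def log_message(self, *args):                  # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), ShortenerHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# urllib follows the 301 automatically, just as a well-behaved crawler would.
resp = urllib.request.urlopen(f"http://127.0.0.1:{server.server_port}/abc123")
final_url = resp.geturl()
body = resp.read()
server.shutdown()

print(final_url)  # ends with /long/destination-page
```

A 302 or a JavaScript redirect would not necessarily pass PageRank the same way, which is why Sullivan's survey focused on which services return a true 301.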
  8. Under what conditions will Google display "Did you mean" search results above the normal results? Google began displaying these promoted "Did you mean" results in November 2008, in cases where they were more confident that the correction would be really helpful. Not all users click the "Did you mean" link: some are not savvy, and some simply never notice it; a kind of blindness means they miss a result that would actually be useful to them. Google shows the new interface when it estimates a high probability that the correction will help the user, not just marginally but substantially, such as when the query was badly misspelled or much better results await the corrected search. It will not be 100% perfect every time. If Google returns bad results, one can always use the plus sign or put a phrase or keyword in double quotes to force the exact search one wants. They keep trying to learn and improve the individual algorithms, and Google has seen a real quality improvement for the majority of users.
  9. Google has been more proactive in providing results that feature "corrected" spellings. How will Google employ smart guesses in search results in the future? Google gets huge numbers of visitors every day; some users are not savvy and do not always spell correctly, and of random queries, roughly 10% may be misspelled. So Google set out to write one of the world's best spellcheckers. But even with a huge click-through rate on "Did you mean", some people did not know it was there. Because of this, Google recently introduced a change: for some queries they show a few results for what they think is the correct spelling first, then the normal results below. For users who do not know how to spell a word correctly, this helps a lot. It also hurts web spam: spammers target typos and misspellings, but with this change regular users do not stumble onto spam; they stumble onto the good results. If you are a power user, and there are a ton of people who are, you can always put a plus before a word, or put even a single word in double quotes, to say, "This is the exact word I meant to search for." Google also tries to be smart the other way: if something looks misspelled but actually is not, Google figures that out over time. Google always tries to come up with something reasonable that works for the vast number of users, and keeps looking for ways to improve the next generation of algorithms and the next data push for the spell-correction code.
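To make the idea of a spelling suggester concrete, here is a toy "Did you mean" function built on the standard-library difflib module. This is purely illustrative: Google's spell corrector is far more sophisticated and is not this algorithm, and the vocabulary below is made up.

```python
# Suggest the closest known word to a (possibly misspelled) query term.
import difflib

VOCABULARY = ["search", "engine", "optimization", "webmaster", "guidelines"]

def did_you_mean(query_word, vocabulary=VOCABULARY):
    """Return the closest vocabulary word, or None if nothing is close enough."""
    matches = difflib.get_close_matches(query_word, vocabulary, n=1, cutoff=0.7)
    return matches[0] if matches else None

print(did_you_mean("serch"))         # search
print(did_you_mean("optimisation"))  # optimization
print(did_you_mean("zzzz"))          # None
```

The cutoff parameter mirrors the confidence threshold described above: a suggestion is only surfaced when the match is strong enough to be likely helpful.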
  10. If one has a lot of blog content for a new section of a site (100+ pages), is it best to release it over a period of time or is it fine to just unleash 100 pages? Generally, one can release a hundred pages at once if it is high-quality content. But at ten thousand, a hundred thousand, or a million pages, one should be a little more careful — not because it would cause any kind of automatic penalty, but because it creates a junk look: one day the site has no content, the next day there are two million pages in the index, and one begins to wonder whether it is legitimate content or auto-generated junk with no value added. When you create content organically, you usually end up with a page at a time, so it is recommended to publish each page when you have it. One need not wait until a lot of different content has been gathered together for a single release; releasing a page at a time is a good thing. At a small scale with high-quality material, there is nothing to worry about.
  11. What are the factors in determining the PageRank of a Twitter page? Is it the followers or backlinks? Remember that Google treats Twitter pages as regular web pages. Google does not consider the number of followers on Twitter; they only consider the number of links pointing to the profile or to specific tweets. So a specific interesting status message with a lot of PageRank — a lot of links pointing to it — is more likely to show up in Google search results. Google ignores follower counts because they know followers can be faked very easily on a lot of different sites, so they always base reputation on backlinks and the reputation of those backlinks. That is how Google figures out how reputable a page is — on Twitter just like on any other site across the web.
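The "links, not follower counts" point can be illustrated with the textbook PageRank power iteration over a tiny hypothetical link graph. This is the classic published algorithm, not Google's production system, and the page names are invented.

```python
# Minimal PageRank: repeatedly redistribute rank along outgoing links.
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping page -> list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:                          # dangling page: spread evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# A tweet linked from two blogs outranks one with no inbound links,
# regardless of how many "followers" either account has.
graph = {
    "blog_a": ["tweet_1"],
    "blog_b": ["tweet_1"],
    "tweet_1": [],
    "tweet_2": [],
}
ranks = pagerank(graph)
print(ranks["tweet_1"] > ranks["tweet_2"])  # True
```

Note that nothing in the model counts followers; only the link structure (and, by extension, the rank of the linking pages) moves the score.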
  12. Will SEO still exist in 5 years? Yes, search engine optimisation will still exist in 5 years. Google tries to build things so that one does not need to be an SEO expert, but SEO genuinely helps in putting the best foot forward for your website, and the notable fact is that SEO is not spam. One can use several white-hat techniques: canonicalization, for example, helps ensure all your URLs consolidate the backlinks they deserve, so that one does not have a lot of duplicate content. There are so many things one can do as a search engine optimizer. SEO ensures that the site is represented well and makes a good impression, which is really beneficial for visitors. So yes, SEO will definitely still exist in 5 years. One must be aware that SEO should not turn to black-hat techniques or illegal stuff, which can sink a website and will not be productive. People should learn about white-hat SEO, but need not become SEO experts; to learn, one can use Google's webmaster console or other search engines' consoles. Google provides an SEO starter guide, which they would not do if they thought SEO had no value. So it's a big YES: SEO will definitely be around in 5 years, and that's a good thing.
  13. Why does Google search not treat the @ sign differently given the rise of Twitter? For example, @google and google give me the same results. This is a deliberate choice: Google did not want to index email addresses, at the very least; one does not want somebody scraping Google to harvest a whole lot of email addresses. So not indexing the @ sign is a thoughtful option. Maybe over time they will start to treat it differently, but for now it has not been a request Google has heard often enough to put resources into it at present.
  14. Google has said that frequently updated pages get boosted in rankings (QDF), which favours blogs and news sites over company sites, which have less reason to be updated often. How important a signal is 'freshness'? First, understand that updating a page or blog often does not by itself mean it should rank higher, so do not over-interpret freshness. Recall what Andrei Broder said about the types of searches people do: navigational, informational, or transactional. Navigational might be looking for HP's or IBM's home page; informational might be how to do something, like setting the default printer in Firefox; transactional might be shopping for products one wishes to buy. It is a very broad hierarchy, and if you dig down a bit, you see that some people seek something fresh: a search about an event that just happened is the kind of query that deserves freshness (QDF). But not every query needs freshness — navigational and evergreen queries do not, and when people search for long-form content and do more research, the freshness factor matters much less. Google uses over 200 signals. Do not fall into the trap of thinking you must have fresh content and therefore altering some words on pages and the byline date so they merely seem fresh; that does not lead to higher rankings. If you are not in a news area or a topic that needs fresh content, you need not worry about it. Evergreen content that lasts and stands the test of time is better to work on than jumping on the latest story. If you are writing about hot breaking gadget news, then yes, stay fresh and ensure the content is relevant. But you need not rewrite pages or change words on a page just to seem fresh.
  15. With canonical tags, can you point a page to itself? Say, for instance, www.google.com/webpage points to www.google.com/webpage — will this cause a loop? The rel=canonical element, referred to as the "canonical link", is an HTML element that helps webmasters prevent duplicate-content issues by specifying the "canonical" or "preferred" version of a web page, which benefits the site's SEO. A rel=canonical tag that loops back to itself does not cause a problem in Google: when Google built and wrote the code supporting the rel=canonical attribute, they built in support to ensure this does not cause any sort of problem, and other search engines commonly handle it as well. Consider the alternative: you would have to check every single URL, do a self-check to see whether you were on that URL, and omit the tag only in that case, which complicates generating the tags. If you wish to have a rel=canonical tag on every single page, there is no problem with that, and even when it loops back to itself, it is not a big deal at all. Google is fine with it.
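For example, a self-referential canonical on the page in question is simply the following tag in the document head (URL as given in the question):

```html
<!-- The page declares itself the preferred version; this "loop" is harmless. -->
<link rel="canonical" href="http://www.google.com/webpage">
```

Emitting this on every page, including the canonical one itself, is simpler than special-casing the self-reference, and search engines are built to tolerate it.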
  16. What is Google's view on guest blogging for links? High-quality guest blogging is worthwhile in some cases; the question is what happens when it gets extreme or someone takes it too far. Take an easy case first: a high-quality writer — say Lisa Barone — is a blogger who maybe wishes to do a guest post on some other blog; they should be happy to have her. Likewise, when people like Vanessa Fox or Danny Sullivan write something for a different blog, one should be happy to have them write an article, as they bring a lot of insight and knowledge. It is a good win for the person who hosts the site, and it is great for someone who is not so well known but writes really well to get known a little better. But it can be taken to extremes: people offering the same blog post many times, or spinning a blog post and offering it to several outlets, until it becomes a low-quality article; or someone "writing" a guest post by outsourcing it to a non-expert and then inserting hyperlinks into it. Having high-quality bloggers jump back and forth or collaborate in different ways is a long and time-honored tradition. One should just be careful, because practices that make a lot of sense with high-quality people break down when the goal is a massive number of links, which Google is less likely to count. The kind of links Google will count come from higher-quality articles, where someone put real effort and originality into it, thought hard about the message, and has their own point of view. Mass-produced guest posts have much lower value and might not be worth the time.
  17. If someone buys a text link from my website, I add the nofollow tag so it has become a paid link. Should one do the same for banner images too? Generally speaking, most banner ads are sold through ad exchanges or advertising networks that block out bots, because they do not want bots crawling their banner ads and messing with their impression or click counts. If you use any standard banner-ad package, the redirects most probably go through URLs that are blocked by robots.txt or are otherwise not crawlable by search engines, so one need not worry about it. But if you sell a link directly, and the only difference is that one is an image versus a text link — if someone paid for it specifically — then put a nofollow on the link, even though it is an image link. Mostly, Google handles typical banner ads and such well, so PageRank does not flow and things are handled accordingly; in that case, too, one need not worry.
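For a directly sold link, the markup is the same whether the anchor wraps text or an image; the advertiser URL and banner filename below are placeholders:

```html
<!-- Paid text link: nofollow so no PageRank flows. -->
<a href="https://advertiser.example.com/" rel="nofollow">Sponsor</a>

<!-- Paid banner link: same attribute, applied to the anchor, not the image. -->
<a href="https://advertiser.example.com/" rel="nofollow">
  <img src="/banners/sponsor-468x60.png" alt="Sponsor banner">
</a>
```

The attribute goes on the a element in both cases; the img tag itself carries no link semantics.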
  18. A search for a physical product ranks Amazon number one, despite it not offering the best user experience. What has Google done to prevent large corporations from dominating search engine results? First, one has to accept that Amazon has a relatively good user experience, though it is not true that Amazon always ranks number one for every physical product. For instance, when one searches for a book, Amazon is up there, but if there is an official homepage for the book, it ranks well and may be number one as well. The trouble is that not every book has a homepage, which still surprises me: authors have their own web pages but often no dedicated page for a particular book — just a lack of savviness. When I searched for Mrs. Byrne's Dictionary of Unusual, Obscure and Preposterous Words, there was no content about it on the web except on Amazon, Goodreads, or Google eBooks. The best answer is to ensure that there is an actual page for the product. Generally, Google tries to find out what the official homepages are — be it for governments, universities, states, or whatever — and tries to ensure they are returned when possible. Google pays attention when users do a search, fail to find the actual homepage for the product, and complain; that is taken into account. Generally, Google looks at the number of links and the content of the page, and a specific webpage that attracts a lot of links because visitors think it is a great site will rank relatively well.
  19. A website has a 'Read More' dropdown window with simplistic design elements; the box includes maximized content, keyword anchor text linking deeper into the site, and a lot more. Will the site be penalized for this? Are there alternatives? One must be aware of some issues here. If the dropdown is one pixel wide and one pixel deep, and what is shown when you click it is eight pages of keyword-stuffed anchor text, that is definitely going to look bad. If you fit the normal idiom — a navigation element that, when clicked, shows a paragraph or two and a link or two — that is what plenty of websites do, and Google is unlikely to classify it as hidden text when you use a common framework for the dropdown. Just do not hide eight pages' worth of content that people would not attempt to read. Googlebot sees the same content people see, and no one wants to visit a page with a whole bunch of text hidden behind a dropdown. One need not worry about normal mouse-activated dropdowns: stick to common idioms and keep the revealed content a reasonable length, and there will be no problem at all.
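One legitimate, above-board way to implement such a 'Read More' toggle today is the native HTML details element (the question's site presumably uses a scripted dropdown; this is just one common idiom, shown as a sketch):

```html
<!-- Collapsed by default, expanded on click; nothing is hidden from crawlers. -->
<details>
  <summary>Read more</summary>
  <p>A short paragraph of genuinely useful detail, with at most a
     link or two deeper into the site.</p>
</details>
```

The expanded copy stays a reasonable length, matching the "paragraph or two, a link or two" idiom described above.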
  20. Someone manages three websites that sell the same products across three domains, but with a distinct selling approach, price structure, target audience, etc. Does Google take this as spammy or black hat? It is not so bad, because the domains are distinguished by layout, selling approach, and structure, and the most notable point is that there are only three of them. With 300 or 3,000 domains, you reach a number that can crowd up the search results and create a worse user experience. Selling different product lines on different domains — men's clothing on one, women's clothing on a second, kids' clothing on a third — is not bad; a small number of domains separated for very naturally separable reasons makes sense. But it seems strange to sell the same products on each separate domain, and if the sites appear similar, it will look strange to visitors — especially once you go beyond three domains. If you have one domain, you have the time to build a reputation for it. With 50 or 100 domains, you will hesitate to put as much interest into each individual one: maybe there is interest at the beginning, but later comes the temptation to auto-generate content or just combine a bunch of feeds, and visiting one domain versus another feels incredibly cookie-cutter. Users will hate it and start to complain. Keep these things in mind if you are about to go from one domain to multiple domains.
  21. Does Google take action on websites that keyword-stuff every phone number in the world onto their pages, so they show up when someone types in a phone number? Yes. Google usually gets a lot of complaints about this. A page that has only phone numbers on it adds no value, and one is annoyed to get page after page of those cookie-cutter sites when typing in a phone number. Google hears complaints from both inside and outside the company. Google treats this as keyword stuffing, where one repeats very similar words right after each other; it is exactly like throwing a dictionary up on the web, except the dictionary is nothing but numbers, and users really find it irritating. So Google takes action against it, treating it as web spam.
  22. How do underscores and dashes differ in URLs? Google treats a dash in a URL as a separator: with "black-love", Google indexes the word "black" and the word "love" separately. An underscore, by contrast, is joined rather than treated as a separator: "First_World_War" becomes one term, findable only by searching First underscore World underscore War, which sounds a little weird. The reason Google does it this way is precision: its engineers were programmers who wanted to be able to search for programming terms, which often contain underscores, so Google joined on the underscore instead of having it act as a separator. In practice it does not make much difference; it is a second-order effect, not a primary factor that makes a huge difference. Wikipedia, for instance, has lots of URLs like First_World_War, and that does not keep them from ranking, since they have PageRank, proximity, and titles going for them. If you are starting a new site, you have a blank slate to work with, so go with dashes, at least for the foreseeable future. Google thought about splitting on underscores some years ago, but it turned out the ranking impact was relatively low, so for now Google joins on the underscore and separates on the dash. In simple terms: if you already have a website that uses underscores and works the way you want, do not rewrite every single URL; if you are about to start a new website, opt for dashes; and if you have already committed to underscores, it is not a big deal to worry about.
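The tokenization difference mirrors how the regex word class works in most languages: "\w" counts the underscore as a word character (so terms join), while a dash is a non-word character (so terms split). The sketch below illustrates the described behaviour; it is not Google's actual tokenizer.

```python
# Extract index terms from a URL slug: dashes separate, underscores join.
import re

def index_terms(url_slug):
    """Split a slug into terms the way a \\w-based tokenizer would."""
    return re.findall(r"\w+", url_slug.lower())

print(index_terms("black-love"))       # ['black', 'love']   -> two terms
print(index_terms("First_World_War"))  # ['first_world_war'] -> one joined term
```

This is also why underscore-heavy programming identifiers like max_size stay searchable as single tokens.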
  23. On what basis does Google change the title it shows in search results depending on the query? When choosing which title to show in a search result, Google looks for a concise description of the page that is relevant to the query. It is based on a few criteria, as follows: * Google tries to find something relatively short. * Google wants a good description of the page, and of the website the page is on. * Google wants it to be related to the query in some way. If your existing HTML title matches these criteria, the default will most often be to use your title, since it exactly describes the page and the site, is related to the query, and is relatively short. If the existing title does not fit — a user who typed the query would see nothing relevant in it, get no idea of what the page is about, and be less likely to click it — then Google may dig a bit deeper: use the content of the page, look at the links that point to the page and incorporate text from those links, or use the Open Directory Project to find a good title. Remember that Google is looking for the best title to help a user assess whether the page is what they are looking for. One cannot completely control which title is shown, but one can anticipate what a user is going to type: ensure the title includes something about the query or the page, the type of site, and some context from the page, so the user knows what they are going to see when they click it.
  24. Google changed the search-engine market in the 90s by evaluating a website's backlinks instead of just its content. Will backlinks lose their importance? Backlinks still have several years left in them. Inevitably, Google tries to figure out how an expert user would judge whether a particular page has satisfied their needs, and backlinks often matter for that: they play a vital role in determining the reputation of a website, whereas visitors mostly judge the quality of the content on the particular page they are visiting. Over time, backlinks will become a little less important. If Google can tell who wrote a particular article — say, Danny Sullivan or Vanessa Fox — that helps establish that it is expert content. Google is getting better day by day at understanding actual language, even without knowing the actual writer, and is investing in something more like the Star Trek computer: conversational search, where one can talk to the machine and it understands. To answer something like "how does the Apple iPhone 6s perform, and how many colours has it been launched in?", it must resolve what is being referred to — the Apple iPhone 6s — which requires understanding natural language well. As Google gets better at understanding who wrote something and what the content really means, the importance of links will decline over time; but at present, Google will continue to use links to assess the basic reputation of pages and sites.
  25. Does Google have a version of the search engine that totally excludes any backlink relevance? In SEO terminology, a backlink is a hyperlink on another page that links back to your own web page, otherwise known as an inbound link (IBL); it helps determine the popularity of a website, and search engines, including Google, consider websites with more backlinks more relevant in search results pages. Coming back to the question: Google does not have any such version exposed to the public, but they have run experiments internally with backlinks turned off, and the quality is much worse. Despite some noise and a lot of spam in link data, backlinks are for the most part a real, big win for the quality of search results. Google has played with the idea of turning off backlink relevance, but at present backlinks help ensure that Google produces the best, most relevant, most topical set of search results.