Well, the fact of the matter is that Surfulater is barely visible on Google (apart from paid ads). So I’ve been doing some research to try to see how we can rectify this sad state of affairs. It is pretty obvious that if folks can’t easily find Surfulater when they do a Google search, the chances of them becoming a customer are pretty slim.
The Search Engine Optimization (SEO) Experts tell you there are some basic things you need to do:
- You need to include your search keywords over and over again, preferably in bold, on your home page. This is referred to as keyword density (a rough sketch of the calculation follows this list). If the keyword density is too high you’ll get into trouble; if it is too low, your ranking in the results will not be good.
- You need to get links from other sites to your site. The sites that link to you need to be related in some way to what your site or product is about. Too many links from inappropriate sites will get you into trouble; too few will see a poor ranking in the results.
- You mustn’t make any major changes to your site, as we did with the recent Surfulater site makeover. We now have a Google PageRank of zero, and it has been zero since the new site was launched earlier this month. I’m told that eventually this will resolve itself.
- You need to create fresh content on a regular basis. Search engines apparently love fresh content. One trick some sites employ is to acquire articles from various places and publish them on their site. This must be the school of: it’s quantity (or size) that matters, not quality.
- You need to have your page titles, meta tags, page names and folder names all set just right.
- You need to watch out for the Google Dance and the Google Sandbox. Lots of people talk about the sandbox, but few will tell you anything about it.
- Publishing clear, concise, honest and interesting information doesn’t seem to matter much.
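As an aside, the “keyword density” the experts go on about is nothing more mysterious than the number of times a keyword appears on a page divided by the page’s total word count. A rough PHP sketch of the calculation (the sample text and keyword are made up, not taken from our pages) looks something like this:

```php
<?php
// Rough keyword-density check: the number of times a keyword appears,
// divided by the total word count of the page, as a percentage.
// The sample text and the "surfulater" keyword below are made-up placeholders.
function keyword_density(string $text, string $keyword): float
{
    $totalWords = str_word_count($text);                                  // words on the page
    $hits       = substr_count(strtolower($text), strtolower($keyword));  // keyword occurrences
    return $totalWords > 0 ? ($hits / $totalWords) * 100 : 0.0;
}

$pageText = 'Surfulater saves your web research. Surfulater keeps notes and web pages together.';
printf("Keyword density: %.1f%%\n", keyword_density($pageText, 'surfulater'));
// 2 occurrences in 12 words, so this prints roughly 16.7%
```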
During my research I was led back to the Google Information for Webmasters page. Here are some points from the “Quality Guidelines – Basic principles” section:
- Make pages for users, not for search engines. Don’t deceive your users, or present different content to search engines than you display to users.
- Avoid tricks intended to improve search engine rankings. A good rule of thumb is whether you’d feel comfortable explaining what you’ve done to a website that competes with you. Another useful test is to ask, “Does this help my users? Would I do this if search engines didn’t exist?”
- Don’t participate in link schemes designed to increase your site’s ranking or PageRank. In particular, avoid links to web spammers or “bad neighborhoods” on the web as your own ranking may be affected adversely by those links.
This is all good stuff, which I totally agree with and follow. And it probably explains why I’m not on Google and my competitors are. Funny how it also contradicts what all the SEOs are telling us to do, as outlined above.
I guess that as long as I refuse to clutter my pages with repeated bold keywords and artificial text written just for search engines, or to indulge in other SEO gimmicks, I’m doomed to hover around in obscurity.
All I can hope for is the day when we have truly clever search engines that give great results, without forcing web sites to pander to the search engines instead of looking after our readers.
Are you sure it’s not this?
http://clsc.net/research/google-302-page-hijack.htm
Help stop this:
http://www.darrinward.com/google302/
No question but that you’re right on most counts. However, I will say that time heals many wounds – that is, it counts for something. I found with toyoland that it took months for Google to list it completely, then give it a PageRank, then actually use the PageRank for searches! On the other hand, metrus.com’s balanced scorecard pages used to be #1 in Google and suddenly dropped down to … someplace in the hundreds. I don’t know why it happened, but I think Google used to really reward long pages, and now they don’t.
Generally I have done very well by Google without any real search engine optimization other than having keywords in headers and in titles. But I have noticed that they often reward TIME. Three to four months for a new site seems to be standard – though there are notable exceptions!
Lawrence, thanks, but I don’t think my lack of visibility on Google is related to this. Interesting issue though.
Dave, thanks for the post. Surfulater.com has been live for quite some time, certainly over 6 months. Before the site makeover went live earlier in May, it had a Google PR of 5. Mind you it has never been visible for relevant keywords.
I need to get more traffic, but I’m not going to spend a heap of my precious time munging pages etc. in an attempt to improve my results.
It is heartening to hear your story. Let’s hope time and patience also pays off for us.
Out of all the sites I have created, the ones that work best have little if any JavaScript, DHTML menus or other advanced HTML in them. Clean, simple and consistent HTML with straightforward navigation helps a lot!
surfulater.com seems way too complex for Google.
I design my sites so that all pages are accessible by simple “a href” tags, with menus appearing at different levels down to level 4. Image links are no good, as Google doesn’t understand the purpose of the link.
Google really seems to favour this simplistic view.
To add to my comment above, littering your text content with bold tags surely won’t help, and it decreases the value of your text. It gives your site a bit of that “desperate salesperson” feel.
Robert, thanks for your comments. You may well be right that Google likes plainer HTML sites better, but why should we be constrained by Google in the look and feel of our sites? This gets back to my point about needing “smarter” search engines.
The Surfulater home page does include plain “a href” tags to almost all of the other pages, but this isn’t the case on other pages and needs to be addressed.
Prior to the recent site makeover, there was much less JS and no DHTML menus. In other words it was a simpler site; however, it still ranked poorly, much the same as now. So I think there is more to this than you suggest.
Re. bolding: my understanding is that surrounding keywords with bold or italics improves their search engine weighting. Marketing books also suggest the use of bold for important key phrases. Web surfers tend to skim, not read, so this helps draw them to the key messages you are trying to get across. Having said all that, I don’t like the look either.
PHP is a great thing.
It has allowed me to serve useful pages with dynamic menus etc. to human users and static HREF menus to search engines.
This isn’t to “trick” search engines, because humans can choose to view the pages with the same static menus.
I think this has helped with Google a lot.
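Roughly, the setup looks like this (a simplified sketch rather than my actual code; the crawler check, page names and “plainmenu” parameter are just placeholders):

```php
<?php
// A simplified sketch, not production code. The crawler regex, the page file
// names and the "plainmenu" query parameter are placeholders.
$userAgent  = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
$isCrawler  = (bool) preg_match('/googlebot|slurp|msnbot/i', $userAgent);
$wantsPlain = isset($_GET['plainmenu']);   // lets a human reader ask for the static menu too

if ($isCrawler || $wantsPlain) {
    // Static menu: plain anchors that any search engine can follow.
    echo '<ul>';
    echo '<li><a href="/index.php">Home</a></li>';
    echo '<li><a href="/features.php">Features</a></li>';
    echo '<li><a href="/download.php">Download</a></li>';
    echo '</ul>';
} else {
    // Dynamic DHTML menu for interactive visitors; the links themselves are the same.
    echo '<script type="text/javascript" src="/menu.js"></script>';
    echo '<div id="dynamic-menu"></div>';
}
```

Both versions expose exactly the same set of links, so nothing is shown to search engines that a human visitor can’t see.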
I agree with Neville: search engines should just get it right.
http://www.webwombat.com.au is an example of a search engine that gets it right. Only really useful, though, if you want to search Aus and NZ.
And it’s immune to the 302 hijack because it ignores all redirects.
I had a similar frustrating experience. I created a blog in December using Blogger, a Google-owned product. I regularly updated it with well-written, accurate, verbose accounts of my travels over 4 months. Only a couple of weeks ago (which is 5 months after creating the site) did Google even acknowledge the site’s existence!
I notified Google of the URL right at the beginning, and a couple of times later.
Now my site is in Google but not fully indexed, so nobody will find it without the correct URL.
As a developer and regular websurfer I can identify with your frustration on both sides of the coin. It used to be so easy to find reliable information on exactly what we wanted back in the days of Compuserve and such. Now, even when we know exactly what we’re looking for, it’s tough to get it to come up in a search engine.

With all the work Google is doing, you’d think they’d be looking to offer more extensive search options for those of us who would like to weed out much of the over-commercialization of the Web while searching, and also to make sure that content creators have a clear understanding of what needs to be done to ensure their site will be indexed and come up where it belongs, without having to be linked to big commercial sites.

I believe the next generation of search engines is going to be community-built, with registered members submitting categorized sites that can be trusted (DMOZ-type stuff but more streamlined and on a larger scale). I’d even pay membership fees to Google for options like being able to search just non-commercial review sites, non-commercial informational sites, product company sites, non-commercial sites in general, or any of the plethora of searches I’d like to be able to do without wading through a million merchant sites and scam-type link sites that purposefully mimic indexes of valid sites to get traffic through search engines. It seems like little to ask, and I’m sure many would pay for better search services.

Anyhow, best of luck with your site and your product!
Just a quick note: You were at the top of the list when I searched for you tonight!