The AnsiLove script used on my site to display my text art galleries stopped working properly yesterday. I have not yet figured out why this is happening.
At the moment you will see broken images instead of the artwork. I hope to find the problem soon and get the gallery working again.
It's a small disappointment after my excitement a few days earlier. I apologize for the inconvenience. I hope you will come back later to check out my gallery once it's up and running.
Carsten aka Roy/SAC
Update 6/1/2006: Problem not found yet. I am still working on it. Please bear with me on this one.
Wednesday, May 31, 2006
Sunday, May 28, 2006
My Art Galleries Launched - Over 700 pieces of Text Art
I came across a nice PHP script called AnsiLove last week. The script converts original ANSI and ASCII files to an image in real time. The characters are rendered correctly, and the DOS ANSI colors are correct as well. The font used is true to the original MS-DOS font. I was delighted.
The script is also easy to use, and I was able to add it to my website in no time at all.
In the past I spent a lot of time making screenshots of my ANSIs and ASCIIs with a screen capture tool and the ANSI/ASCII viewer ACiDView for Windows (which can be found at my download page). All my sample art on my Roy/SAC text artist homepage and my "deviations" at my deviantART artist page were done this way.
I have to continue using this method for my deviantART pieces for two reasons, though. First, because I have to upload my art to their servers, and second, because I also add some comments with background information for each piece of art. I believe this is a good thing to do, because it gives the viewer a bit more insight into the times and events surrounding the creation of the artwork.
Okay, I can't use it for my deviantART stuff, but I was able to put all of my 700+ pieces of ASCII and ANSI art onto my website with very little effort. This is great!
While I was working on the scripts to display my text art, I realized that I can do the same for my VGA art as well, without AnsiLove.
I finalized the new pages today and would like you to check them out. I created several galleries: an ASCII gallery with all my ASCII art, an ANSI gallery with all my ANSI art, a "Best Of" gallery with all my favorite text art pieces (ANSI and ASCII mixed), a "Latest Text Art" gallery where I simply copy my recent creations to, a VGA art gallery with my SAC VGA work, and finally my web graphics gallery, which shows some stuff I did for my websites.
Here is an example of how AnsiLove renders an ASCII. The image is generated from the original ASCII file and rendered in real time. Great, isn't it?
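To give a rough idea of what a renderer like AnsiLove has to do under the hood, here is a small Python sketch. It is my own illustration, not AnsiLove's actual code or API: it walks a string containing ANSI escape sequences and turns it into colored character cells. A real renderer would then draw each cell with the original DOS font and the classic 16-color text-mode palette.

```python
# Illustrative only: parse SGR ("Select Graphic Rendition") sequences like
# ESC[1;31m and track the current foreground/background color state, which
# is the core bookkeeping any ANSI-to-image converter performs.

def parse_ansi(data: str):
    """Return a list of (char, fg, bg) cells; fg values 8-15 are the bright colors."""
    fg, bg, bold = 7, 0, False      # DOS default: light gray on black
    cells = []
    i = 0
    while i < len(data):
        if data[i] == "\x1b" and i + 1 < len(data) and data[i + 1] == "[":
            j = data.index("m", i)  # assume a well-formed SGR sequence
            for code in data[i + 2 : j].split(";"):
                n = int(code or 0)
                if n == 0:                      # reset all attributes
                    fg, bg, bold = 7, 0, False
                elif n == 1:                    # bold -> bright foreground
                    bold = True
                elif 30 <= n <= 37:             # foreground color
                    fg = n - 30
                elif 40 <= n <= 47:             # background color
                    bg = n - 40
            i = j + 1
        else:
            cells.append((data[i], fg + 8 * bold, bg))
            i += 1
    return cells
```

From the cell list, producing the image is then just a matter of blitting the matching 8x16 DOS font glyph in the right palette colors for each cell.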
Friday, May 26, 2006
Are you ready for "CJ Analytics"?
At first I didn't see the plainly formatted email from Commission Junction, which was sent on Tuesday night. It did not look important among the bulk of emails in my inbox on Wednesday morning. I first saw the announcement at Threadwatch. The Threadwatch post refers to Jeff Molander's view on the topic and to Scott Jangro's blog post, which was very good at pointing out some of the major issues with CJ's idea. I decided to leave a long comment there and pointed out even more problems with CJ's plan to replace regular URL links with AdSense-like or Google Analytics-like JavaScript code. This post repeats some of my comments, but from a slightly different perspective. I also refer to statements made there a couple of times without repeating them here.
I recommend reading all the posts and comments, and also checking the hundreds of posts at the ABestWeb marketing forum and, of course, the source of all this trouble: CJ's email announcement of their Link Management Initiative (you need a CJ advertiser or publisher account for access).
The Plan
The CJ JavaScript ads cannot be used in all cases. The option to get one URL for email, another for PPC advertising, and yet another for product feeds is not going to work in all real-world scenarios either.
It will get really ugly if CJ disables referrer tracking for a link that was created for one purpose but is being used for a different one by the publisher.
Here is a real-life example of the problems. Our shopping portal site is a perfect example of all the things that are not going to work, or at least not going to work as well as or better than existing text or banner ads.
We keep pretty much all of our content in a database (MS SQL Server) and use a custom CMS solution to update our website content. The content is displayed on our websites, but it is also syndicated via RSS feeds and pepped up by Feedburner.
Ever heard of syndication and aggregation?
If you are familiar with Feedburner, you will know that their "smart feeds" can be used in a lot of different ways. They can be browsed like a website, imported into blog/RSS directories and search engines, and of course aggregated by all kinds of readers and services, such as Feedblitz.com, which aggregates RSS feeds and conveniently sends the aggregated content via email to their customers. We use the professional co-branding feature of Feedblitz for our email deal alerts feature. Now add Google Co-Op to the mix to make the biggest organic and PPC search engine part of the dilemma.
Based on CJ's definition, I would require at least three different links for the same content: JavaScript links for our website, the Feedburner feed "site", and web-based RSS readers such as Google Reader, Bloglines or My Yahoo!.
When the content gets aggregated by Feedblitz, Newsgator or similar RSS-via-email services, the email link should be used, because the user is clicking from his email client. There is virtually no difference from any other email newsletter you would send to the user if he were one of your opt-in email newsletter subscribers.
When used for blog/RSS search engines or special services like Google Personalized and Google Co-Op, the search engine link must be used, especially for Google Co-Op, which behaves just like PPC ads, with the difference that the user subscribed to your content and wants your ads/content displayed for matching keywords.
None of this will work unless CJ develops a self-reconfiguring link, or link-morphing system, that changes form, structure and content depending on its current use. The whole Web 2.0 movement seems to have stopped for CJ at AJAX, ignoring the fact that everyone is working on breaking down the barriers between the different content delivery methods and technologies, to give users the personal choice to get the content they want, when they want it, and in the format they want it in. This conflicts completely with CJ's approach with their JS links.
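To make the channel problem concrete, here is a hypothetical Python sketch of the bookkeeping CJ's plan would force on a publisher. The channel names and URL parameters here are invented for illustration; they are not CJ's actual link formats.

```python
# Invented example: one piece of content, several delivery channels, and a
# different mandatory link variant for each. A static choice made at publish
# time cannot be right when the same content flows through every channel.

CHANNEL_LINKS = {
    "website": '<script src="https://network.example/ad.js?pid=123"></script>',
    "email":   "https://network.example/click?pid=123&type=email",
    "search":  "https://network.example/click?pid=123&type=ppc",
    "feed":    "https://network.example/click?pid=123&type=feed",
}

def link_for(channel: str) -> str:
    """Return the link variant a channel would require under such a scheme."""
    if channel not in CHANNEL_LINKS:
        raise ValueError(f"no link variant defined for channel {channel!r}")
    return CHANNEL_LINKS[channel]
```

The trouble described above is exactly that the publisher often cannot know the channel at publish time: the same RSS item ends up on the website, in a web reader, and in an email digest.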
I wrote up in detail what other ramifications this change would have for our website, even ignoring RSS feeds, blogs, XML, etc. In short, to make it work for us, a complete rewrite of our CMS system, public site, tracking and reporting would be required. A partial recreation of all our content would also be necessary. Everything we built during the last 5 1/2 years would be affected, except our "new" Blogger blog.
Are you ready for "CJ Analytics"?
I am also not sure if I want interactive, AJAX-powered JavaScript code from CJ on virtually all pages of my website. CJ would get full control of and complete access to everything that makes our website a website. I am not sure if I am willing to entrust CJ with all this highly sensitive data. It is the same problem as with Google Analytics, which we are using today.
I still have a strange feeling when I think about the data Google collects, but Google is at least much less likely to take advantage of the data and use it against us, or to boot us out of a lucrative business by adopting our secret tricks and techniques.
I am not so sure about that when it comes to CJ. What is tracked? Who has access to the data? With Google Analytics, we at least get some very comprehensive statistics and analytics tools in exchange for access to the data. I don't think that CJ will make the SEO, SEM, funnel, goal, drop-out and conversion data, which I am sure CJ will collect to a certain extent, freely available to the publisher. You probably also remember the issue from earlier this year, when CJ employees with access to sensitive and very profitable publisher information and data turned affiliate and left the network. And they only had access to the very limited amount of data available back then. With the JS code on publisher sites, CJ will be able to see, know and track everything as well as, or even better than, the website owner himself.
Conclusion
Dynamic JavaScript ads are nice and a valuable add-on to the existing linking methods currently provided to publishers. Unless CJ changes their business model to become a Blog Ads, Google AdSense or TextLinkAds type of business, JS links will not be able to replace a lot of the existing URL text links.
Look at LinkShare's DRM and Be Free's dynamic ads (BestBuy.com has good examples). There you can see very good and useful examples of using JavaScript for specific types of ads that would not be possible using the classic technologies. We use those DRMs and dynamic ads on our site for some very specific purposes where they do a better job than image ads, text ads or text links.
History tends to repeat itself
That is the right approach. I still have some memories of CJ changing their product linking and redesigning the CJ Marketplace (product catalog) at the end of 2002. I made some very detailed comments, along with a lot of other affiliates and merchants, who were for the most part very unhappy with CJ's change.
CJ moved forward and ignored most of the feedback, which was very specific and not just generic bitching and yelling. For us, CJ's decision resulted in the removal of all CJ product links from our sites for about three years, until we had the necessary hardware to push around several gigabytes of text data at once and had developed a complete CJ feed watching, analyzing, pre-processing and updating system.
I have the feeling that CJ will move forward as planned, so we will start to prepare as much as possible. This includes looking for CJ merchant programs at other networks, such as Performics, which seems to be favored by a lot of merchants. I see a lot that have a program at both networks. It used to be the other way around: we always replaced any other network or even in-house program with CJ's, if one was available. There were only a few exceptions to this rule.
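The feed system mentioned above is not described in detail in this post, but the core "feed watching" idea can be sketched. This is my own hedged illustration with invented field names, not CJ's actual feed format or our production code: fingerprint each product row, and only touch the rows whose fingerprint changed since the last import, instead of reloading gigabytes of data every time.

```python
import hashlib

def row_hash(row: dict) -> str:
    """Stable fingerprint of one product row (independent of field order)."""
    payload = "|".join(f"{k}={row[k]}" for k in sorted(row))
    return hashlib.sha1(payload.encode("utf-8")).hexdigest()

def changed_rows(old_hashes: dict, feed_rows: list) -> list:
    """Return only the rows that are new or changed since the last import."""
    return [row for row in feed_rows
            if old_hashes.get(row["sku"]) != row_hash(row)]
```

With this, a nightly job only has to update the small fraction of products whose price or description actually moved.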
Uncertain Future
This will push a lot of publishers out of CJ to look somewhere else, but I am sure that it will also attract different types of new affiliates to CJ. I would not be surprised if some contextual advertisers that use AdSense, the Yahoo Publisher Network and similar services today take notice and check it out.
It is hard to tell whether this will be better for CJ in the future or not. I am sure that CJ was thinking about the datafeed affiliates and will probably officially launch their new web service for products.
It will become a feasible replacement option for some types of datafeed affiliates, but certainly not for all. Brook Shaaf and Nate Griffin read my "scrap" about all that stuff via email. I think I need to expedite my content processing and put it at least as a "scrap" on my website, the same way I did it for the Merchant Datafeeds for Affiliates 101 scrap. I am sure that web services will become a mainstream topic, probably later this year, or at the latest at the beginning of next year.
Closing Words
I seriously hope to be proven wrong and that CJ will change their plans, make the use of JavaScript link code optional, and continue to provide classic URL text links and images. But I hoped for that in the past, too, and so far I have unfortunately never been proven wrong. This post is long enough now. Feel free to leave your comments and thoughts about this.
Sunday, May 21, 2006
Site Cleanup and Re-Structure and a New Site launch
Although my shopping portal ConsumerMatch.com keeps me busy with a gazillion outstanding things on my to-do list, I took the time to take care of some outstanding things for my family domain, Cumbrowski.com.
In addition to some design fixes, I split the content of 2 pages across 11 pages. The big Roy/SAC page is now 3 pages. The downloads and links, and the article about the three styles of the underground text art scene, are now pages of their own, making the content easier to access and better to link to.
The biggest change was to my professional homepage. It had a huge resource section and a whole block about my article-writing activities, which was becoming way too much for a single page. The split of my professional homepage resulted in a total of 8 pages.
The professional homepage concentrates on me and what I am doing. I also kept the industry events with upcoming dates there. My articles and "scraps" got their own page, which keeps growing as I continue to produce new articles and other content.
The resources became 6 new pages; I created a separate page for each topic. The topics are: Internet Marketing, Affiliate Marketing, Search Engine Optimization (SEO), Search Engine Marketing (SEM)/PPC Marketing, Website Development, and last but not least Database Development.
The pages are not 100% finished yet. I have to restructure most of them and also add additional resources which I did not add when everything was crammed onto a single page. You should definitely check them out, because even in their current state you will find tons of resources that you will not easily find in one place anywhere else.
But I did work on stuff for ConsumerMatch.com as well. Last week we launched our niche-targeted, co-branded dating service "Gothic Match", for the young men and women who are attracted to dark clothes and have a natural dislike of sunlight :).
I like industrial music and EBM, which is borderline Gothic. My favorite bands are VNV Nation, Depeche Mode, Project Pitchfork, Rammstein and Blutengel, so go figure. If you are also into this kind of stuff, or a full-blooded Goth, check it out.
Cheers,
Carsten aka Roy/SAC
Wednesday, May 17, 2006
Google Notebook launched.
People had already been blogging about it for a while; secret screenshots and video presentations made homepage news on sites like digg.com. The wait finally came to an end: Google's new and highly anticipated service, Google Notebook, launched less than 24 hours ago and is now open to the general public. Free of charge, of course, like all the other Google services.
I seized the opportunity and signed up right away because I still remember what happened with other popular Google Services like Google Analytics or Google Pages shortly after they launched.
The rush for the new services was high, people signed up like crazy, and Google denied access to new customers, offering only the option to be added to a waiting list and get access months later.
I created 3 public Notebooks and will continue to play around with this new gadget for a bit to find an actual (practical) use for it.
If you want to have a peek, visit my public Google Notebook. It is nice that you can share your notebook with the public. What I miss is the option to share it only with friends (who also have a Google account). I also would like to see the option to give others the right to add comments to my notes.
Whether to allow or disallow comments, and who is allowed to comment, should be configurable per note and also per notebook.
Well, it's a beta, isn't it?
We will see how it develops and how people are going to use it. Nothing more than that will determine the future development of this new Google service.
Cheers,
Roy/SAC aka Carsten Cumbrowski
Thursday, May 04, 2006
Google issues getting worse and worse
You have probably heard in the news, or from an internet-savvy friend, about Google's Big Daddy update and the mounting problems that have developed since the update started at the beginning of February. My business site, the comparison shopping portal ConsumerMatch.com, was affected by the Google problems, as were hundreds of others. These have been very frustrating months for everybody who got "hit" by this.
The worst thing of all is that there is nothing we can do about it but wait and hope that Google gets its act together and that everything will be back to normal sooner rather than later.
It started last summer with Google's big three-phase Jagger update. The goal of Jagger was to better detect spam and duplicate content and to cleanse the Google search index of millions of pages of junk. The Jagger update caused serious problems for our websites, and we were not the only ones; a lot of other sites had problems too. Google was able to remove a lot of spam from their index, but together with the spam it removed a lot of legitimate and valuable content.
The last months of 2005 were pretty much a roller coaster ride for us: added to the index, losing pages, removed, appearing, growing, and then the same from the beginning again. To make a long story short: our Google traffic during the peak of the holidays was virtually nothing.
Google engineer Matt Cutts posted on 10/19/2005 at his blog what webmasters whose websites dropped out of the index should do. Article: Update Jagger: Contacting Google. I did not yet see the necessity for us to contact Google regarding our website. Instead, I started working on possible solutions for the probable problems of our website.
I believe that the problem with our site was caused by duplicate content. At the time we had multiple websites on multiple domains with similar content: our old site Shop-Links.net, which was started in 2001, and our current site ConsumerMatch.com (where we also own the .net domain).
Matt Cutts published posts in December and January regarding the changes in Google's logic and what webmasters should and should not do. To address canonicalization, webmasters should implement 301 redirects from the duplicate content to the original version. All 302 redirects should also be removed from the site, since 302 redirects were used by spammers in the past, to avoid an accidental penalty from Google. See:
The Little 301 That Could, 12/21/2005 and discussing 302 redirects, 1/4/2006
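As a sketch of the canonicalization fix described above: known duplicate hosts answer with a permanent 301 redirect to the canonical domain instead of a temporary 302. The domain names below are the ones from this post; the handler itself is my own illustration, not the actual site code.

```python
from urllib.parse import urlsplit, urlunsplit

CANONICAL_HOST = "www.consumermatch.com"
DUPLICATE_HOSTS = {
    "consumermatch.com", "consumermatch.net", "www.consumermatch.net",
    "shop-links.net", "www.shop-links.net",
}

def canonical_redirect(url: str):
    """Return (status, location): 301 to the canonical host for a duplicate,
    or 200 with the unchanged URL when it is already canonical."""
    parts = urlsplit(url)
    if parts.netloc.lower() in DUPLICATE_HOSTS:
        target = urlunsplit(
            ("http", CANONICAL_HOST, parts.path, parts.query, parts.fragment))
        return 301, target      # permanent: search engines consolidate the URLs
    return 200, url             # already canonical, serve the page normally
```

The key point from Matt Cutts' posts is the status code: a 301 tells the crawler the duplicate has moved for good, while a 302 leaves the duplicate in the index.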
Jagger, the actual logic change, wrapped up by about November 18, 2005. Google was then getting ready for the Bigdaddy infrastructure upgrade, which is supposed to handle canonicalization and redirects better.
We had our 301 redirects in place when Bigdaddy started, and everything was looking good at the beginning of February; the webmaster community was praising the search results returned by the test datacenters. Google got ready to push it out to all datacenters. See the post by Danny Sullivan at Search Engine Watch on 2/2/2006: Google Bigdaddy Search Infrastructure To Rollout More Broadly.
The rollout was performed throughout February and March; one datacenter after another was upgraded to the new BigDaddy infrastructure and bot. By the middle of February, news about unexpected issues was already appearing. Known and new issues are still being reported to this day, and a solution is not in sight.
I collected some news from the previous months that will give you a pretty good idea of the issues.
The Register reported today:
Full-up Google choking on web spam?
"Webmasters have been seething at Google since it introduced its 'Big Daddy' update in January, the biggest revision to the way its search engine operates for years. Alarm usually accompanies changes to Google's algorithms, as the new rankings can cause websites to be demoted, or disappear entirely. But four months on from the introduction of "Big Daddy," it's clear that the problem is more serious than any previous revision - and it's getting worse."
Zoomzoom Marketeers blogged last Friday:
Having problems with Google indexing?
"If you haven't experienced problems with Google dropping the amount of indexed pages on your site, you are certainly one of the lucky ones."
Newcybertech Weblog reported on 4/8/2006:
Google Algorithm Problems
"Have you noticed anything different with Google lately? The Webmaster community certainly has, and if recent talk on several search engine optimization (SEO) forums is an indicator, Webmasters are very frustrated"
So far, nothing has been reported from the Googleplex in California that provides any information or facts to help understand what's going on. I don't know if it is so quiet because Google engineers are working on a solution to the problem, but I hope that is the reason for the silence.
Let's hope for the best for everybody. Google's reputation is seriously suffering at the moment because of this.
During the last two weeks, a lot of people were working overtime on their websites for the reasons mentioned. Others were busy investigating the issue in general to determine what the heck happened.
At the same time interesting articles appeared that talked about recent discoveries and developments in search engine behavior and ranking logic.
The Google search engine and its crawlers showed some remarkable behavior (positive, and not related to the BigDaddy issues). The conclusions drawn from these observations show a trend in the search industry that will have major consequences for a lot of websites in the future. How near that future is, is hard to determine at this point.
Anyway, the changes are so severe that what you learned about textbook search engine optimization (SEO) during the last 10 years might become obsolete very soon.
It started with Mike Grehan's post at ClickZ on 4/17/2006 with the title: Does Textbook SEO Really Work Anymore?. Mike Grehan is a renowned veteran of the SEO industry and CEO of Smart Interactive Ltd.
This stirred up quite a lot of controversy and responses, and eventually caused Mike Grehan to post a follow-up article at ClickZ.com on 5/1/2006 with the title: Does Textbook SEO Really Work Anymore?, Redux. The examples Mike presented were remarkable: high-ranking pages that had not even been crawled yet, and high-ranking pages where the search term can't be found anywhere within the page at all (no, not a cloaked site).
After finishing the articles, I realized that the current issues caused by Jagger and BigDaddy will seem like minor issues to some in the not-so-distant future, if Mike Grehan is even 50% right about his predictions.
The last 12 months were already turbulent. It seems that this was only a taste, an appetizer for the things to come in the next 12 months.
What will the web be called after that? Web 2.1? Web 3.0? Web 3.0 Beta?
I sometimes wish I were a fortune teller, just to be able to get an answer to some of those questions. Unfortunately I am not, so like most other people I simply have to adapt to the new environment, react to sudden changes, and be as prepared as possible to make the best of the new situations and resulting opportunities.