
Thursday, 27 November 2008

InLinks paid links service sparks debate

Chris Crum | Staff Writer, WebProNews

And the Gloves Come Off...

Quite a storm of debate has erupted over a new service called
InLinks, essentially a paid text-link service that allegedly
makes the links it sells hard for Google (and other search
engines) to detect. And the mouths of Internet marketers begin
to salivate.

The debate has basically turned into Matt Cutts vs. the "Yeah,
let's stick it to Google" crowd. As far as I can tell, this
started with TechCrunch reporting on InLinks, which prompted Matt
Cutts to send them an email, from which the following is a sample:

Google has been very clear that selling such links that pass
PageRank is a violation of our quality guidelines. Other search
engines have said similar things. The Federal Trade Commission
(FTC) has also given unambiguous guidance on this subject in a
recent PDF, where they said "Consumers who endorse and recommend
products on their blogs or other sites for consideration should
do so within the boundaries set forth in the FTC Guides
Concerning Use of Endorsements and Testimonials in Advertising
and the FTC's guidance on word of mouth marketing," as well as
"To date, in response to this concern, the FTC has advised that
search engines need to disclose clearly and conspicuously if the
ranking or other presentation of search results is a function of
paid placement, and, similarly, that consumers who are paid to
engage in word-of-mouth marketing must disclose that fact to
recipients of their messages."

From InLinks:

Blogger Benefits

  • Ads that are easy on the eyes. Sell in-content ads without the annoying pop-up!
  • Predictable revenue. Get paid a flat rate per month per ad sold.
  • Full editorial control. Approve or deny ads as they are sold, or allow us to control them.
  • Blog friendly. Install our simple plugin and we take care of the rest. Just sit back and collect your monthly earnings.

Plugins are available for Movable Type, WordPress and Drupal.

My opinion:
as long as it ain't down, it's up, Sparky.

Tuesday, 25 November 2008

Spam @ $200 apiece: Facebook

On Friday (21-11-2008), federal judge Jeremy Fogel awarded Facebook $873 million in damages against Adam Guerbuez and Atlantis Blue Capital. According to the social network, it is the largest award ever under the Controlling the Assault of Non-Solicited Pornography and Marketing Act (CAN-SPAM). Divide that sum by the more than 4 million spam messages involved and you arrive at the roughly $200 apiece in the title.

According to Max Kelly, Facebook's director of security, the company doesn't expect to ever collect the money but believes it will be a powerful deterrent to anyone who messes with the company or its users.

According to the complaint, Guerbuez sent more than 4 million spam messages to Facebook users between March and April. He allegedly did so by stealing Facebook users' logon details using phishing messages and through data obtained from third parties. He then allegedly bombarded Facebook users' message posting pages, or "walls," with messages from the hijacked accounts of spam recipients' Facebook friends.

The sleazy messages could be seen by anyone viewing an affected Facebook profile, and they appeared to be endorsed both by the account owner and by the friend who had posted them.

"The spam promoted numerous products and Web sites that, on information and belief, are offensive and embarrassing," the complaint explains. "The products marketed by these spam messages included marijuana, male enhancement pills and sexually oriented material."

Source: earthtimes.org

Sunday, 23 November 2008

Woops

I was programming a mini-site that spawns about 90,000 pages out of thin air, and to test it I put it on a subdomain on a server. I had just started normalizing the database and programming the functions layer, adding some MooTools Ajax stuff, and it was starting to look good.

At some point I checked the SeoQuake stats bar, suddenly saw "3 indexed pages", and went "****". I use search-engine-friendly URL rewrites with a tag from a domain database table as the category, but since my data is all test data, half of the tags don't match. So if Googlebot decides to go for it, I end up with 90,000 screwed-up pages full of old, incorrect data in the Google index, under the wrong URLs.

Like /sex/news.google.com/
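To make the failure mode concrete, here is a minimal sketch of the kind of rewrite rule involved, assuming Apache mod_rewrite; the pattern, page.php, and the parameter names are hypothetical stand-ins, not my actual code:

    # Hypothetical rule: map /<tag>/<domain>/ onto a lookup script.
    # The first path segment is treated as a category tag; if that tag
    # has no match in the database table, the page still gets generated,
    # just with the wrong data behind a wrong URL.
    RewriteEngine On
    RewriteRule ^([a-z-]+)/([a-z0-9.-]+)/?$ /page.php?tag=$1&domain=$2 [L,QSA]

With live data the tag lookup resolves to the right category; with test data, the mismatches are exactly what produce URLs like the one above.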

I don't think Google will appreciate that :) So I put the entire subdomain on Disallow in robots.txt; I've heard that works. I hope so, otherwise I'd better erase the subdomain.
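For what it's worth, blocking the whole subdomain comes down to a two-line robots.txt in its document root:

    # Block all well-behaved crawlers from the entire subdomain
    User-agent: *
    Disallow: /

That keeps compliant bots from crawling anything further, but URLs that are already indexed can linger for a while before they drop out, so erasing the subdomain stays on the table as the surer fix.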