
Archive for the ‘Web-based’ Category

The Dangers of an Advertising Monopoly

THERE has been some heated discussion recently about market distribution in the online advertising sector. An observation worth making is that most companies are in the business of driving other companies out of business, whether deliberately or not.

With the rise of software as a service, many businesses rely neither on acquisition fees nor on subscriptions for revenue. They use advertising instead. It appeals to newcomers and facilitates rapid expansion. But what happens when these businesses rely on a middleman for advertising? What happens when that advertising provider is itself among those competing against the Web-based services that rely on it?

Sadly, many businesses rely on companies such as Yahoo and Google, which manage their advertising and connect them with the advertiser. Both ends, the advertiser and the service, are customers. The middleman gains the most. It is hard to compete with companies such as Yahoo and Google when they in fact make pure profit from advertising. It is almost as though any business that uses a middleman for advertising is sharing its revenue with a competitor. The margins simply cannot be compared.

To use an example, if a company uses Yahoo for advertising in its specialised CMS, then Yahoo gets a share of the profits. If Yahoo wanted to compete head-to-head, it would not be subjected to the same third-party ‘taxation’. Therefore, it would find it easier to compete.

With this little load off my mind, perhaps it’s worth adding that advertising will always remain a controversial thing. It is a form of brainwash. Marketing lies.

Divisive Web

According to an article that I recently read, the Internet could one day be broken down into separate networks that are isolated and selectively dispersed around the world. This means that the global nature of the Web, as well as its wealth of information, would cease to exist. Moreover, it heralds the final goodbye to a state in which few or no censorship barriers can prevail. This changes one’s perspective entirely.

This worrisome move is entirely different from the issue of Net neutrality, which in itself concerns separation of the Web into multiple tiers. It is also reminiscent of rumours about ‘Googlenet’, where one submits a site to a dark, privatised Web that gets indexed and closely monitored (obviating the need to crawl remote servers and use pings for distant notification).

In the long term, whether this is totally disastrous or not remains to be seen. Consider, for instance, the peculiar expansion of the resources that are made publicly available. Let’s take a look at the way the Web has evolved in recent years. Only a tiny cross-section of the ‘visible’ Web involves content spammers (or scrapers), where visibility is crudely defined by search engines (internal sites and intranets aside). In reality, however, spam content, deliverable as it is, may actually form the majority of what exists on the Web, because spammers spawn colossal colonies of junk and dummy content. This leads to (or involves) blogalanches and ‘poisoning’ of the index/cache, subverting search results in the process. All this leads to chaos as search engines diverge from the correct results and deliver something less meaningful. In the struggle for good spots (or visibility) in search engines, spam rises and leads to attacks of various sorts. Temptation leads to vandalism, which leads to further maintenance. The Web no longer seems like an appealing place to be.

But can division of the Web help? I very much doubt it. It’s all about authorities controlling information. Brainwash is the means for making others think alike, comply, and even be submissive.

Wikis Finally Embraced by Academics

The Public Wiki section on this domain

Some good news with regard to collaboration with tomorrow’s technology (pardon the pun).

“The collaborative editorial process of wikis often results in a stunning degree of accuracy. A study by the science journal Nature found Wikipedia nearly as accurate as Encyclopedia Britannica. In fact, for summaries on niche issues and emerging interests, the biggest wiki of them all — Wikipedia — is often the best available source of information.”

It’s about time. I have been doing this for over a year, but the people I work with refuse to embrace the concept of collaborative editing. They just toss 2 MB E-mail attachments back and forth, bloating and clogging each other’s inboxes, still unable to spot the actual changes made.

Here is one example. The page is currently locked for editing because it is a year and a half old, which puts it past its ‘shelf life’ and justifies guarding against Wiki SPAM.

Squashing Zombie Armies By Moving Server?

Server room

I have just entered a squash competition, which is due to begin in October. I hope I can make a decent run for a change; judging by previous years, I tend to lose in the early rounds. While I’m experienced at tennis, I rarely get the chance to practise squash. Moreover, those who participate in the competition are rather good in general. They seem to be skilled with the swing and are able to see the game from a different and more advanced perspective. Endurance and strength cannot defeat these qualities.

As a secondary note, my site is likely to be down (offline, to phrase it more gracefully) for 8 hours tonight. The domain is being moved to a newer server, which is definitely good news. Zombie armies are said to have grown in scale quite significantly.

Earlier today I read an article about their impact. It claimed that Windows vulnerabilities have led to a rise of over 20% in zombie numbers in just one week, and the implication for a Linux user is more SPAM and more DDOS attacks. With control by proxy there is, quite sadly, no liability. Yesterday I spotted an unidentified bot (probably illegitimate) which devoured half a gigabyte of pages from my site. It is costing resources and money.

In other news, this morning I submitted a first draft of my thesis. I can finally exhale for a while.

Viral Marketing Accusations

IT is pretty much evident by now that I have joined the Netscape team. I never denied it, nor did I say a word until it was official (and publicly stated).

I am very much pleased to have been given an opportunity to work with a group of talented people. Up close and personal, figures whom you were taught to dislike (principally Calacanis) are quite friendly and kind. They are not the devils that you were led to believe they are. What bothers me most are some recent accusations that come from conspiracy theorists. Some would argue that Netscape is trying to ‘poison’ Digg’s index, which is of course preposterous. Netscape would never use destructive measures or viral marketing techniques. To quote what fits the latter category (from Wikipedia):

Viral marketing is sometimes used to describe some sorts of Internet-based stealth marketing campaigns, including the use of blogs, seemingly amateur web sites, and other forms of astroturfing to create word of mouth for a new product or service. Often the ultimate goal of viral marketing campaigns is to generate media coverage via “offbeat” stories worth many times more than the campaigning company’s advertising budget.

Blocking Ads – An Example for Digg.com

I already addressed ad blocking controversies several times in the past, the context being slightly different each time, e.g.:

With blocking often comes some guilt, as it is resistance to the developer’s intent (much like modification of source code). To me, however, the bottom line has always been that I am never (or rarely) interested in the content of the ads, all of which are promotional and often urge the visitor to spend money (if not merely to be exposed to, and absorb, some brand name). Thus, any click that I am ever likely to make would probably verge on click fraud (assuming pay-per-click programs), and that revenue goes to many sides: it most likely pays the authorities through taxation, the advertising mediator (e.g. Google, Yahoo/Overture), and the Webmaster. So everybody is essentially gaining at the expense of the client who pays for these ads. If you are not genuinely interested in an ad, never follow the link. Following it is probably as unethical as (if not more unethical than) excluding the ads altogether.

Digg.com is no exception to all of this. It recently seems to have adopted Google ads that are rather stubborn, in the sense that most plug-ins or rules cannot hide them well away from sight. When Digg version 3 launched (only a few weeks ago), it took only minutes for me to get irritated by the waste of space at the top, which is assigned purely to contextual ads from Google. None of my three blocking mechanisms in Mozilla Firefox was enough to wipe it off the page, much to my misfortune, as I use Digg very heavily and my Web browser window is short (due to kpager at the bottom of the screen). Either way, there was a simple way to suppress those ads: merely exclude the div in question. This can be done with a simple selector in Firefox’s userContent.css, which makes that top banner ad disappear, never to be thought of again.

Find your chrome directory under your main installation:

  • /Applications/Firefox.app/Contents/MacOS/chrome in Mac OS X
  • C:\Program Files\Mozilla Firefox\chrome in Windows
  • firefox/chrome in your Linux local installation directory

You might also find it convenient to do this on a per-user basis, by locating your chrome settings under individual profiles, e.g. ~/.mozilla/firefox/1ixj9tdk.default/chrome on my SuSE Linux box.

Edit (or create, if it does not already exist) a file named userContent.css and add the following rule:

*[src*="banner_ads"] , div[id="top_ad"] { display: none !important;}

This was inspired by Firefox AdBlock, which is far more comprehensive but appears to be no longer available (its older URI is now broken). You can always append new rules by looking at the source code of pages and adding selectors for exclusions, even using simple wildcards.
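To illustrate, here is a minimal sketch of a few additional rules of that kind for userContent.css. The quoted fragments below (sponsor, ad_wrapper, doubleclick) are hypothetical examples rather than strings taken from any particular site, so substitute whatever names you actually find in the page source:

/* Hypothetical examples: adjust the quoted fragments to match the actual page source */
img[src*="sponsor"] { display: none !important; }       /* any image whose src contains "sponsor" */
div[class*="ad_wrapper"] { display: none !important; }  /* any div whose class contains "ad_wrapper" */
a[href*="doubleclick"] { display: none !important; }    /* links that point at a known ad network */

Note that Firefox only reads userContent.css at startup, so a browser restart is needed before new rules take effect.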

Users Are Efficient; Neither Stupid, Nor Lazy

WHY is it that so many user interfaces simply fail to work? It’s because users take shortcuts and ignore the instructions. This, in fact, is the message delivered by Jeff Veen, whose opinion was inspired by another’s.

Veen concludes: “They’re not stupid. They’re not lazy. Don’t treat them that way.” Users are efficient. They want to get the job done with the least effort. It just doesn’t bode well as far as the intent of the developer is concerned.

New Yanoff for Palm – an example of poor UI design
