Archive for April, 2012

Thoughts on Privacy on the Web

Cookies and cross-site connections help track Internet users in ways far worse than most people realise. People assume that when they visit a particular site, it is that site alone which knows about them; they also assume that being logged off means offering no identifying details. In reality, things are vastly different, and it is much worse when public service sites act as “traps” that jeopardise privacy. A site that I recently looked at (as part of my job) does seem to comply with some of the basic rules, but new advisories are quite strict. To quote: “The UK government has revised the Privacy and Electronic Communications Regulations, which came into force in the UK on 26 May, to address new EU requirements. The Regulations make clear that UK businesses and organisations running websites in the UK need to get consent from visitors to their websites in order to store cookies on users’ computers.”

The BBC coverage of this indicates that “[t]he law says that sites must provide ‘clear and comprehensive’ information about the use of cookies…”

Regulating cookies is not enough. ISPs too can store data about the Web surfer and, as Phorm taught us, they sometimes do. They sell information about people.

On more and more public sites, HTTPS/SSL is supported and cookies stay within the “root” domain, in the sense that the visitor intended to visit only this one domain (aside from some external bits like Twitter timelines in the sidebars or on the front page; loading up Twitter.com, even via an API, might help a third party track identities). Shown in the following image is the large number of cookies used when one accesses pages from Google/GMail (even without having a GMail account).

Cookies

Although SSL is now an integral part of this service (since the security breaches that Windows caused), privacy is not assured here. Although Google’s folks don’t swap cookies across domains, they do track the user a great deal and they have many cookies in place (with distant expiry dates) to work with.

Information on how Google will use cookies is hard to obtain, and the problem is of course not unique to Google cookies. Most web browsers automatically accept cookies, so it is safe to assume that about 99% of people (or more) will just accept this situation by default. If a site provides visitors with information about cookies, permits secure connections (secure against a man in the middle), and does not share information about its visitors (unlike the EU Commission, which foolishly wanted to put spyware, Google Analytics, in its pages), then there is at least an indication of a desire to adhere to best practices.
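As a rough illustration of the scale, here is a minimal sketch in Python (assuming the third-party “requests” library; the URL is merely an example) which counts the cookies that a single page load sets:

import requests

session = requests.Session()
session.get("https://www.google.com/")

# Every cookie the server (and any redirects) set during this one visit
for cookie in session.cookies:
    print(cookie.domain, cookie.name, cookie.expires)

print(len(session.cookies), "cookies from a single page load")

Even without logging in, the cookie jar tends to fill up quickly, often with expiry dates far in the future.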

Cookies are not malicious by design as they are necessary for particular features, but to keep people in the dark about the impact of cookies on privacy is to merely assume that visitors don’t care and won’t care about the matter. And that would be arrogant.

To make some further recommendations: privacy should be preserved by limiting the number of direct connections to other sites. Recently, I have been checking the source of some pages to see if there’s any unnecessary HotLinking in public sites, which would be a privacy offence in the sense that it leaves visitors’ footprints on another site. Outbound links can help tracking, but only upon clicking; the bigger issues are things like embedded objects that invoke other sites, like YouTube. HotLinking, unlike Adobe Trash, cannot result in quite the same degree of spying (Google knows about IP addresses and individual people). If all files can be copied locally, then the problem is resolved. Who operates the linked sites anyway? If it’s a partner or a sister site, then storing files remotely might be fine, but with AWS growing in popularity, Amazon now tracks a lot of sites, e.g. through image hosting.
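This kind of check is easy to automate. Below is a hedged sketch in Python of the sort of audit described above (the audited URL is hypothetical; the “requests” library is assumed): it parses a page and lists every external host referenced by src/href attributes, i.e. candidates for HotLinking:

from html.parser import HTMLParser
from urllib.parse import urlparse
import requests

PAGE = "https://example.org/"  # hypothetical page to audit

class ResourceAuditor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.hosts = set()

    def handle_starttag(self, tag, attrs):
        # src/href attributes are where hotlinked resources hide
        for name, value in attrs:
            if name in ("src", "href") and value:
                host = urlparse(value).netloc
                if host:
                    self.hosts.add(host)

auditor = ResourceAuditor()
auditor.feed(requests.get(PAGE).text)

for host in sorted(auditor.hosts - {urlparse(PAGE).netloc}):
    print("external:", host)

Every host this prints is a server that learns the visitor’s IP address on each page load.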

Sites like Google, Facebook (FB) and Twitter, if linked or embedded in a Web page, can end up taking a look at who’s online at the site. All it takes from the visitor is the loading of a page, any page for that matter. FB is often criticised for the “like” button too (spyware). JavaScript (JS) has made the spying harder to keep track of; it would be best practice to perhaps offer JS-free pages by default, which limits viewing by a third party, assuming those scripts invoke something external. Magpie RSS can help cache copies of remote data locally and then deliver them to the visitor without the visitor having to contact another server when loading up the primary target site. Some sites these days have you contact over 10 different domains per page load. It’s the downside of mashups, and it extends to particular browser components too (those which “phone home”, though the user usually has more control over them than over unknown and unpredictable page source). Google and Microsoft use their cookies to track people at both levels, browser and in-page (sometimes under the guise of “security”, babysitting and warning about “bad” sites you visit). Facebook and Twitter only do the latter, and a lot of people don’t welcome that. Facebook, notoriously, profiles people (e.g. are they closeted gay? Is there fertility/erectile dysfunction? Any illnesses the person obsesses over?) and then sells this data to marketing firms and partners, reportedly Microsoft too.
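For those who want the Magpie-style behaviour described above without PHP, the idea is simple: the server fetches and caches the remote data, so only the server, never the visitor, contacts the third party. A minimal Python sketch (the cache path, TTL, and feed URL are all assumptions):

import os
import time
import urllib.request

CACHE_FILE = "timeline.cache"          # hypothetical cache path
CACHE_TTL = 600                        # refresh at most every 10 minutes
FEED_URL = "https://example.org/feed"  # hypothetical remote feed

def cached_feed():
    fresh = (os.path.exists(CACHE_FILE)
             and time.time() - os.path.getmtime(CACHE_FILE) < CACHE_TTL)
    if not fresh:
        # Only the server talks to the remote host, not the visitor
        with urllib.request.urlopen(FEED_URL) as remote:
            data = remote.read()
        with open(CACHE_FILE, "wb") as cache:
            cache.write(data)
    with open(CACHE_FILE, "rb") as cache:
        return cache.read()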

Public sites have different regulations applied to them because many people are required to visit them (e.g. to pay taxes); it is not a choice. There are also the sovereignty principles (e.g. should Google know who, when, and how European citizens access the government sites which they themselves paid for?).

In society there is a lot of ransom going on, much of which people do not recognise or which will never be known or reported. It relies primarily on information, unless there is a physical hostage situation (where the prisoner is in danger of mortal harm). The bottom line is that those who have the potential to embarrass others possess a lot of power, so there is a fundamental issue of civil liberties at stake. This is why, among several reasons, TSA agents stripping people (literally, figuratively, or in a scanner) is a way of dehumanising and thus weakening the population, normalising indecency and perhaps returning us to memories of some human tragedies. The privacy people have is tied to their dignity, worth, and sense of self- and mutual respect. Privacy is not a luxury; it is an important tenet of society. Society will suffer if privacy is altogether lost.

Keeping Web-based Software Updated

One of the problems leading to the cracking of many Web sites is that software is not kept up to date. It is not an easy task unless the process is made simple and at times automatic, because people are averse to change and to risk (the risk associated with updating software, never mind the risk of getting cracked). Keeping abreast of security fixes and new upgrades for Web-based software is not easy unless one uses an operating system like Debian, which can be updated regularly and has strict requirements for inclusion. There are several points worth making here:

1. Some CMSs are better equipped for this type of scenario. Across my ~15 domains I have a dozen or so different CMSs and some are antiquated, e.g. PHP-Nuke, and depend upon updates coming from upstream, e.g. phpBB with the infamous uploader hole (~2008). Other software, such as WordPress (it’s my favourite, as I was also part of the developer community for many years), alerts all users about the need to update the software. That keeps up appearances by reducing the number of reports of cracked sites.

2. In recent years people have been using installer scripts like Fantastico to set up the software; Softaculous is another one. Three days ago WordPress issued a security fix (local privilege escalation and XSS for the most part, not too critical for some site setups), which automatically sent me several E-mails like the following (from domains where I used Softaculous to set things up):

"The following script updates are available:

WordPress 3.3.2:
[omitted]

To upgrade these scripts go to your Control Panel -> Softaculous -> Installations.
There you will be able to update the scripts.

From Softaculous Cron Jobs ([IP removed])"

Each bit of software typically keeps administrators abreast of security holes, but some software does not do this. WordPress alerts even writers, urging them to contact their admin for updates. Other bits of software require that one subscribe to a mailing list or regularly check for updates. Back in the old days, and this is still how MediaWiki works for the time being, people were advised to subscribe to a mailing list (or blog) carrying announcements about security fixes. If many customers have Joomla sites, then it’s useful to be subscribed to such lists and then update everything for everyone in batch mode (for WordPress I need to update 8 sites each time a fix comes out, and for some I need to do this manually from the shell due to different server settings).
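Batch updates of this sort can be scripted. A minimal sketch in Python, assuming wp-cli is available on the server (the site paths are hypothetical):

import subprocess

SITES = [
    "/var/www/site1",
    "/var/www/site2",
    # ...one entry per WordPress install
]

for path in SITES:
    print("Updating", path)
    # "wp core update" fetches and applies the latest core release
    subprocess.run(["wp", "core", "update", "--path=" + path], check=True)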

It helps to have a database of installed software, recording which server is running which piece of software. It would be surprising if no such list had already been compiled by those who operate many servers. It helps to know what can be updated at the same time, by the same person, with the same files.
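Such a list need not be fancy. A minimal sketch using SQLite (the table and field names are my own assumptions):

import sqlite3

db = sqlite3.connect("inventory.db")
db.execute("""CREATE TABLE IF NOT EXISTS installs (
                  server   TEXT,
                  path     TEXT,
                  software TEXT,
                  version  TEXT)""")
db.execute("INSERT INTO installs VALUES (?, ?, ?, ?)",
           ("web1", "/var/www/site1", "WordPress", "3.3.2"))
db.commit()

# Everything that can be updated in one pass with the same files
for server, path in db.execute(
        "SELECT server, path FROM installs WHERE software = ?",
        ("WordPress",)):
    print(server, path)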

Some updates are merely about new features and might not even be backward compatible. Some software, like WordPress 2.0, is LTS (maintained for inclusion in Debian stable), so it’s unlikely to require much updating. So, one can just look at what has changed and only update if the update is security-related or fixes a data-jeopardising bug (in WordPress 3.2, for instance, people who rushed to update for non-security reasons merely suffered from bugs and then had to update again to 3.2.1).

Panda Sneeze

Still Doing FOSS (Free/Open Source Software)

ALTHOUGH it’s premature to make predictions or statements of intent, it seems possible that I will add to my existing jobs and in the process move down south near London. I was on vacation there a couple of years back, but otherwise it’s just a place that I visit every 2-3 months. One employer of mine is based there, so I might move there permanently.

Techrights is still my love and passion, so I have no plan to cease promoting Free/open source software (FOSS) on that site. In all my jobs I deal with FOSS, and all code that I produce is, as ever before, free for inspection, reuse, and sharing. FOSS encourages better coding and it improves solidarity in our society.

Protests Spread to Canada

Failed Attempts to Use Pattern Recognition Methods for Points Selection

We (my colleagues and I, though especially I) are unable to get good performance from edge detection and other ordinary means, especially because in 3-D things are fuzzier. We will try normals again in the coming days.

I have been trying long and hard, varying parameters and tinkering with what’s available in the image, to place points at edges and corners, but the performance is never better than when ICP is used to put the points at the same locations on the grid (after alignment by ICP, of course; a sketch of this baseline follows the images below).

Image 1:

Image 2:
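For reference, here is a minimal sketch (Python with NumPy/SciPy, on synthetic data) of the ICP-style baseline mentioned above: sample a coarse grid of points on one aligned surface and take their nearest neighbours on the other surface as the corresponding points:

import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
surface_a = rng.random((400, 3))                       # aligned cloud A
surface_b = surface_a + rng.normal(0, 0.01, (400, 3))  # noisy twin B

samples = surface_a[::20]                  # coarse grid of points on A
distances, indices = cKDTree(surface_b).query(samples)

corresponding = surface_b[indices]         # same locations on B
print("mean correspondence error:", distances.mean())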

I have been working to compare and assess how using inner circles/rings would improve our older implementation, which mostly judges similarity based on outer circles/rings. This test can be improved a lot by choosing more points, but it will take time to get the results.

Image 3:

Image 4:

Going back to older methods, which use one single criterion for verification and also use coarse sampling of the surface, performance is not encouraging; the next step will be an attempt to use surface normals or another bit of information which can be derived from the surface alone (see the sketch below).

Image 1: Results of a coarse approach tested on ~400 pairs.

Image 2: Example of a pair of images.
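As for surface normals, they can indeed be derived from the surface alone; a minimal sketch (NumPy/SciPy, synthetic data) estimates each normal as the smallest-variance direction of a point’s local neighbourhood:

import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(1)
points = rng.random((500, 3))
points[:, 2] *= 0.05          # a roughly planar test surface

tree = cKDTree(points)

def normal_at(i, k=12):
    _, idx = tree.query(points[i], k=k)   # k nearest neighbours
    neighbourhood = points[idx] - points[idx].mean(axis=0)
    # The right singular vector with the smallest singular value is
    # the direction of least variance, i.e. the estimated normal
    _, _, vt = np.linalg.svd(neighbourhood)
    return vt[-1]

print("normal at point 0:", normal_at(0))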

The Mail Services Deserve to be Out of Service

Royal Mail logo

THE SNAIL MAIL services are having a hard time. They are becoming obsolete, and rather than improve in order to compete they have only become overpriced and they consistently underperform. It’s not just Royal Mail; this is a universal problem. Breaking up with the mail system altogether is hard for obvious reasons; switching to E-mail, which is not so reliable either [1, 2], poses the same issues because particular companies, utilities for example, insist on using postboxes (although some are moving towards E-mail, which isn’t much better).

The mail services are not trying to deliver as they should, so essentially they have stopped even showing what makes them more reliable than their competition. Why even bother with them? A lot of stuff can be shipped by alternative, more reliable means, or even reproduced by printing (notably forms that can be dealt with digitally or perhaps scanned after a manual process).

The level of disturbance, or even agony, due to lost/missing mail is very high. In the majority of cases delivery seems to end in failure. It doesn’t have to be like this. I am a resident at a place near the Town Hall/Centre (nothing difficult to find or reach) and I have found it nearly impossible to receive packages, even when I am at home waiting by the intercom. Over the past year or so only 2 out of about 8 registered parcels sent to this address actually reached me. The post people don’t buzz me (I work at home) and eventually they return important packages to the sender or keep them at the warehouse. It is not possible to contact the post office to get the packages by alternative means; not even a note is left to suggest one do this. This is becoming a systematic and very serious issue, and attempts to reach Royal Mail by phone are pretty pointless; it is exceedingly hard to find the phone numbers (they hide those) and once a number is finally dialed a person is hard to reach; they try to just automate an already-defunct system. Over the phone, one needs to make sure s/he got the numbers and codes right because their voice recognition is terrible, leaving one just wasting time talking to a robot and navigating through voice menus.

After a lot of struggle, I did manage to actually speak to a person from the post office (yes, they exist! And it’s not outsourced!). I spoke to an agent at Royal Mail, the national (British) mail service, who also kindly checked the Parcel Force tracker just in case. She insisted that I must speak to another post office, so it’s obviously becoming a game of ping-pong for them, which means the client will have to spend dozens of hours of work trying to restore ‘justice’. I spoke to them for 10 minutes and had to insist very strongly that they take action immediately, as their excuses are bad. They claim that the office has no record of the package, which is supposed to be track-able (registered mail), so they must urgently do something, right? But no. In a perfect (or better) world, they would look into it and issue compensation, but when there’s a monopoly on the mail system and the customers are assumed not to be right, no wonder the service is terrible. Time after time they get away with the same excuses, the same terrible service, and the same reduced productivity for everyone around.

People may be losing their jobs as the mail services implode, but society will have better services replace them, i.e. ones that actually work and put the customer first. For me, the mail service is something to avoid by all means possible. It’s just trouble. I would rather send and receive no parcels than deal with this ‘random delivery’ mode.
