Archive for the ‘Technology’ Category

Lawyers Who Don’t Use Encryption When Suing Government Entities With Access to Intercepted Material (Mass Surveillance)

And why every law school should teach everyone about encryption before any other “IT skills”


THERE IS a disturbing trend shared among pretty much all lawyers and other ‘legal’ professionals. I know because I checked. I also know because I saw how my friend, Pamela Jones (the paralegal behind Groklaw), got spooked by the spooks and stopped writing online, having rejected my offer to use encryption about 8 years earlier (saying it would only attract more attention). These are smart people who seem to ignore the threat of surveillance even when the threat is out there in the open, thanks to people like Edward Snowden. A lot of what Snowden showed had been known to me for years, but now there is undeniable proof which even the NSA’s chronic lies cannot cover up or cast doubt on. Ignorance is no longer a valid excuse.

I currently have a very strong case against a decision from the British government. I am sure I’ll win; the only questions are when and at what cost (I have already spent thousands of pounds on it). I am not going to elaborate until the case is over, whereupon I will also release sensibly redacted papers (removing personal information) and explain the abuses which I have become aware of and personally suffered from. These abuses have affected at least 4 people that my solicitor alone (an activist against torture) is working with, so nationwide there may be thousands of such victims. It is hard to say for sure how widespread this type of abuse has become, but one can estimate by extrapolation. In the future I will also file a formal complaint about it and then press my Member of Parliament to take action (not just yet).

Now, let’s deal with the key issue — or ‘beef’ — of this post. As in any legal case, papers are sent back and forth, often electronically. It is a practical thing to do because of speed (delivery is instantaneous for images and text). The material which the solicitor and I have already exchanged over E-mail is known to the respondent, which holds copies (this includes a request for appeal). Some material does not necessarily need to stay under the table, especially when it is accessible to both sides anyway. Just as one requires no anonymity when purchasing a flight ticket (the ticket itself already eliminates any chance of anonymity), some documents may as well be visible to the opponent. There is not much to lose there.

But then there’s more sensitive stuff, like strategy.

Lawyers and barristers should always send sensitive material encrypted, over secure channels (to protect client-solicitor privacy and privilege). E-mail is one of the least secure methods of transferring data. It is almost as though it was designed for surveillance and for profiling and linking people; in reality it simply got exploited by spooks, and the protocols never adapted to counter these inherent deficiencies (even encrypted mail exposes the identities of the sender and recipients). Face-to-face meetings or snail mail are better because bugging is hard, and in the latter case interception is hard to achieve unobtrusively, e.g. opening envelopes and re-sealing them. Since GCHQ and some government departments (e.g. the Home Office) work together on increasing surveillance, right now under the guise of ‘emergency’ as if we’re in wartime, we can assume — pessimistically — that they may be studying the cases against them based on interception and preparing themselves with this prior knowledge, or increased awareness. This is of course not acceptable, but then again, we already know that obeying the law is not our government’s best strength. That’s a debate for another day. In other circumstances one could probably chat or write about these issues (I know that my solicitor too advocates human rights in some capacity), but this is not the subject of this post.

As one who writes prolifically on issues of national security, I have good reason to suspect I have no privacy unless technical measures are taken to protect it. I encrypt mail where possible, but I depend on others doing the same. Encryption is not a one-sided preference; it needs to be agreed on and embraced by both ends.
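The metadata problem mentioned above can be illustrated with a short sketch (a hypothetical Python example; the addresses and subject line are made up, not anything from a real case): even when the body of a message is PGP ciphertext, the envelope headers travel in cleartext, so anyone intercepting the traffic learns who is talking to whom, and about what.

```python
from email.message import EmailMessage

# Build a message whose body is (hypothetically) already PGP-encrypted.
msg = EmailMessage()
msg["From"] = "client@example.org"
msg["To"] = "solicitor@example.org"
msg["Subject"] = "Re: appeal strategy"
msg.set_content(
    "-----BEGIN PGP MESSAGE-----\n"
    "...ciphertext...\n"
    "-----END PGP MESSAGE-----\n"
)

# This is what actually crosses the wire (and what an interceptor sees):
wire_form = msg.as_string()
print(wire_form.splitlines()[:3])  # headers are readable despite the encrypted body
```

The ciphertext hides the content, but the sender, the recipient, and even the subject line remain visible to any interceptor, which is exactly why encryption alone does not solve the problem E-mail poses for legal correspondence.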

People don’t want to jeopardise a case by unnecessarily giving away strategic arguments to the opposing side. I have seen people (usually in the US, some of whom I know online) on whom subversive means were used (illegal actions by those in power) to intimidate, harass, libel, and so on. Completely bogus charges can be made up and hyped in the media, framing a person is very common (digitally too), and draining one’s resources through legal fees is also a common tactic of vendetta.

Any solicitor who wants to take on the government of his/her country absolutely must learn to encrypt. But this should not be limited to cases like these. Several months ago it turned out that the US government had spied on a US law firm which was advising a foreign nation on trade negotiations (a corporate matter). We know these types of abuses do happen in the West, so lawyers must learn to protect themselves. Unless they can sue to stop these practices (illegal actions by their government), they will need to adopt technical means of overcoming these dangers.

Perhaps I have become too cynical or too pessimistic when it comes to my government obeying the rule of law, but based on some recent revelations, the record supports me. We are living in times of lawlessness for the rich and powerful, and of oppression (through tyrannical laws and overreach) for the rest.

Thoughts on Privacy on the Web

Cookies and cross-site connections help track Internet users in ways far worse than most people realise. People assume that when they visit a particular site, only that site learns about them. Moreover, they assume that because they are logged off they offer no identifying details. In reality, things are vastly different, and it is much worse when public service sites act as “traps” that jeopardise privacy. A site that I recently looked at (as part of my job) does seem to comply with some of the basic rules, but new advisories are quite strict. To quote: “The UK government has revised the Privacy and Electronic Communications Regulations, which came into force in the UK on 26 May, to address new EU requirements. The Regulations make clear that UK businesses and organisations running websites in the UK need to get consent from visitors to their websites in order to store cookies on users’ computers.”

The BBC coverage of this indicates that “[t]he law says that sites must provide “clear and comprehensive” information about the use of cookies…”

Regulating cookies is not enough. ISPs too can store data about the Web surfer and, as Phorm taught us, they sometimes do. They sell information about people.

In more and more public sites, HTTPS/SSL is supported and cookies remain within the domain that is “root” in the sense that the visitor intended to visit only this one domain (despite some external bits like Twitter timelines in the sidebars/front page; loading up Twitter.com, even via an API, might help a third party track identities). Shown in the following image is the large number of cookies used when one accesses pages from Google/GMail (even without having a GMail account).

[Image: Cookies]

Although SSL is now an integral part of this service (since the security breaches that Windows caused), privacy is not assured here. Although they don’t swap cookies across domains, Google’s folks do track the user a great deal and they have many cookies in place (with distant expiry dates) to work with.

Information on how Google will use cookies is hard to obtain, and the problem is of course not unique to Google. Most Web browsers automatically accept cookies, so it is safe to assume that about 99% of people (or more) will just accept this situation by default. If a site provides visitors with information about cookies, permits secure connections (secure against a man in the middle), and does not share information about its visitors (contrary to the EU Commission, which foolishly wanted to put spyware, namely Google Analytics, in pages), then there is at least an indication of a desire to adhere to best practice.
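To make the “distant expiry date” point concrete, here is a small sketch using Python’s standard library (the cookie name, value, and domain are invented for illustration, not taken from any real Google header). A persistent identifier with an expiry decades away, scoped to a whole domain tree by a leading dot, is all that is needed to link a person’s visits together for years:

```python
from http.cookies import SimpleCookie

# Construct a Set-Cookie header of the kind a tracking service might emit.
cookie = SimpleCookie()
cookie["UID"] = "abc123"                                  # the identifier that links visits
cookie["UID"]["expires"] = "Fri, 01-Jan-2038 00:00:00 GMT"  # a distant expiry date
cookie["UID"]["domain"] = ".example.com"                  # leading dot: all subdomains
cookie["UID"]["path"] = "/"                               # every page on the site

header = cookie.output()  # the header the browser would receive and obey
print(header)
```

A browser that “automatically accepts cookies” will silently store this and send the identifier back on every request to any `example.com` subdomain until 2038, which is precisely the behaviour the Regulations quoted above now require consent for.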

Cookies are not malicious by design as they are necessary for particular features, but to keep people in the dark about the impact of cookies on privacy is to merely assume that visitors don’t care and won’t care about the matter. And that would be arrogant.

To make some further recommendations: privacy should be preserved by limiting the number of direct connections to other sites. Recently I have been checking the source of some pages to see if there is any unnecessary hotlinking in public sites, which would be a privacy offence in the sense that it leaves visitors’ footprints on another site. Outbound links can help tracking, but only upon clicking; the bigger issues are things like embedded objects that invoke other sites, such as YouTube. Hotlinking, unlike Adobe Trash, cannot result in quite the same degree of spying (Google knows about IP addresses and individual people). If all files can be copied locally, then the problem is resolved. Who operates the linked sites anyway? If it is a partner or a sister site, then storing files remotely might be fine, but with AWS growing in popularity, Amazon now tracks a lot of sites, e.g. through image hosting.

Sites like Google, Facebook (FB) and Twitter, if linked or embedded in a Web page, can end up taking a look at who’s online at the site. All it takes from the visitor is the loading of a page, any page for that matter. FB is often criticised for the “like” button too (spyware). JavaScript (JS) has made the spying harder to keep track of; it would perhaps be best practice to offer JS-free pages by default, which limits viewing by a third party, assuming those scripts invoke something external. Magpie RSS can help cache copies of remote data locally and then deliver them to the visitor, without the visitor having to contact another server when loading up the primary target site. Some sites these days have you contact over 10 different domains per pageload. It’s the downside of mashup, and it extends to particular browser components too (those which “phone home”, though the user usually has more control over them than over unknown and unpredictable page source). Google and Microsoft use their cookies to track people at both levels – browser and in-page (sometimes under the guise of “security”, babysitting and warning about “bad” sites you visit). Facebook and Twitter only do the latter, and a lot of people don’t welcome that. Facebook, notoriously, profiles people (e.g. are they closeted gay? Is there fertility/erectile dysfunction? Any illnesses the person obsesses over?) and then sells this data to marketing firms and partners, reportedly Microsoft too.
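The caching approach in the spirit of Magpie RSS can be sketched in a few lines of Python (the function and file names here are illustrative, not Magpie’s actual API, which is PHP): the server fetches the remote resource itself and serves a local copy, so visitors never contact the third-party host at all.

```python
import hashlib
import os
import tempfile
import time

CACHE_TTL = 3600  # re-fetch from the remote host at most once per hour
CACHE_DIR = tempfile.mkdtemp()

def cached_fetch(url, fetch, cache_dir=CACHE_DIR, ttl=CACHE_TTL):
    """Return the resource body, contacting the remote host only when stale."""
    path = os.path.join(cache_dir, hashlib.sha1(url.encode()).hexdigest())
    if os.path.exists(path) and time.time() - os.path.getmtime(path) < ttl:
        with open(path, "rb") as f:
            return f.read()          # served locally; no third-party request made
    body = fetch(url)                # only the server ever talks to the remote site
    with open(path, "wb") as f:
        f.write(body)
    return body

# Simulated remote fetcher, so the sketch is self-contained and offline:
calls = []
def fake_fetch(url):
    calls.append(url)
    return b"<rss>...</rss>"

first = cached_fetch("http://feeds.example.org/news.rss", fake_fetch)
second = cached_fetch("http://feeds.example.org/news.rss", fake_fetch)
print(len(calls))  # only one remote request despite two page loads
```

The visitor’s IP address, browser fingerprint, and visit time never reach the third party; only the server’s periodic fetch does, and that reveals nothing about individual readers.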

Public sites have different regulations applied to them because many people are required to visit them (e.g. for paying taxes); it is not a choice. And then there are sovereignty principles (e.g. should Google know who, when, and how European citizens access the government sites which they themselves paid for?).

In society there is a lot of ransom going on — a lot of ransom that people do not recognise or that will never be known or reported. It relies primarily on information, unless there is a physical hostage situation (where the prisoner is in danger of mortal harm). But the bottom line is, those who have the potential to embarrass others possess a lot of power, so there is a fundamental issue of civil liberties at stake. This is why, among several reasons, TSA agents making people strip off (literally, figuratively, or in a scanner) is a way of dehumanising and thus weakening the population, normalising indecency and maybe returning us to memories of some human tragedies. The privacy people have is tied to their dignity, worth, and sense of self-respect and mutual respect. Privacy is not a luxury; it is an important tenet of society. Society will suffer if privacy is altogether lost.

GIF Animations in LaTeX

LaTeX helps render for a variety of output types, including posters and Web pages, not just A4 sheets. As a typesetting language it is very powerful, but for advanced functionality it requires additional packages, included in the preamble. It appears as though GIF animations are not supported in LaTeX, despite the fact that, if documents are exported as Web pages for instance, the notion of animation makes sense. This is a shame really, and if someone knows of a workaround, please leave a comment. I am currently writing a 400-page report which is a comprehensive summary of what I am doing, and without animations it might be hard to express what is going on. For example, compare the following triplet of static and dynamic figures (which HTML is happy with):
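For PDF output (though not for HTML export), one workaround worth mentioning is the animate package: extract the GIF’s frames to individually numbered PNG or PDF files (e.g. with ImageMagick) and embed them as a frame-by-frame animation that JavaScript-capable PDF viewers such as Adobe Reader will play. A minimal sketch, assuming 24 frames named frame-0.png through frame-23.png at 12 frames per second:

```latex
\documentclass{article}
\usepackage{animate} % provides \animategraphics for frame-based animations
\usepackage{graphicx}

\begin{document}
% \animategraphics[<options>]{<frame rate>}{<basename>}{<first>}{<last>}
% autoplay and loop mimic the behaviour of an animated GIF:
\animategraphics[autoplay,loop]{12}{frame-}{0}{23}
\end{document}
```

This does not help with HTML export, where the original GIF could simply be used, but it at least brings animation into the PDF side of a report.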

New Interview With Me

Head over to Muktware, where there is this new interview.

YouTube Versus Television

TECHNOLOGY moves on and one must adapt to it. “Luddite” is a word that is often used to discourage those who deviate from the norm, such as those who refuse to carry a mobile phone everywhere they go. The term Luddite in this case refers not necessarily to rejection of progress but to conformity, or the lack thereof.

Television is a generic term which refers to a device for remote viewing of something. Conventionally, however, we think of television as a set, although some large bits of furniture or even projectors might nowadays qualify as televisions. What is common to almost all of them is that, with the exception of streaming or on-demand viewing, television is controlled by broadcasters, who have a lot of control over the viewer’s mind for many hours of each day. The viewer can typically select the least undesirable channel among a finite number. Just because there are many channels these days does not mean one can watch a lot of them at once (simultaneously), so this limitation remains. The choice is illusory.

YouTube is different for several reasons as by its nature it allows anyone to broadcast and it also gives the viewer a lot more control over what is being watched. This is why I stopped watching television and eventually gave my set to a friend. The set was of no use anymore. It felt more like a device for passing commercials and clips that I did not wish to see. Sure, there were exceptions, but those were very rare. To choose a channel is still an illusion of choice as that hardly leaves much selection in the hands of the viewer. The choices are preselected by other people.

Recently I started to get more actively involved in YouTube, not as a mere viewer. Back when YouTube presented statistics on how many videos a given account had watched, the number 22,000 came up for mine, and since then I have watched probably about 50,000 videos on YouTube. So I pine to become part of those who contribute. In the coming weeks I will convert some older material and upload it to YouTube. It may be an interesting experience. Can a viewer engage in a two-way exchange of information? That would certainly benefit society, as it can weaken the power of media empires over people’s minds. It can also help promote the TechBytes show to people who have never heard of it. At the very least, as an experiment, I shall see how it goes. This might be rethought.

Legacy Pages

CLEARLY, when one writes and maintains a Web site, keeping pages up to date is a tough task, and the bigger the site is, the harder it gets. Updates can be made to the appearance of pages as well as the content. Unlike newspapers, for example, sites can be accessed 10 years down the line without carrying a timestamp to indicate that the information in them may no longer be accurate. This is fine in the case of schestowitz.com because most pages contain some sort of timestamp (most pages here are about 6 or 7 years old). Even when pages get updated, it makes sense to keep the old content intact, at least as a form of legacy. That’s what I did over the weekend with the introduction page, which someone complained about as it was about 7 years old and needed a refresh. The bottom line is, for certain types of sites, keeping them up to date is a monumental task. Webmasters do not deserve hassle for it.
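One half-measure worth sketching (a hypothetical helper, not something this site actually runs): since most pages here already carry some sort of timestamp, a template could derive a staleness notice from the last-modified date, so old pages warn the reader on their own instead of demanding constant manual refreshes.

```python
from datetime import date

def staleness_notice(last_modified, today=None, threshold_years=2):
    """Return a warning string for pages older than the threshold, else ''."""
    today = today or date.today()
    age_years = (today - last_modified).days / 365.25
    if age_years < threshold_years:
        return ""  # the page is recent enough; no banner needed
    return ("This page was last updated %d years ago; "
            "its contents may be out of date." % int(age_years))

# A page from roughly the era mentioned above, viewed years later:
notice = staleness_notice(date(2004, 5, 1), today=date(2011, 6, 1))
print(notice)
```

A fresh page yields an empty string and renders unchanged, so the banner only ever appears where the legacy problem actually exists.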

Free Software More Than a Hobby

Throughout my career I’ve always had many eggs in the basket. I’ve usually had multiple jobs and I was never fired; I always succeeded in job interviews (since 2003), except the ones with Google, which approached me three times (I never approached them regarding a job). One thing I’ve learned over the years is that one must choose a job one enjoys, otherwise it’s a chore. I never accepted a job that I disliked. I have worked two jobs simultaneously several times (simultaneously as in overlapping months/years), sometimes on top of already being a full-time Ph.D. candidate. I still work two jobs and I very much enjoy both; it’s like leisure, as there is a sense of achievement. Besides all of this, as a hobby I maintain some sites that promote freedom; I was never paid for this. It is part of my reading of material; it’s like a learning experience which has also proved beneficial to many others — those who share my interests. Being enthusiastic about freedom comes very naturally.

After many years of wanting to run an independent business on the side, I’ve decided to start creating a professional site. The original idea was to come up with a new name (and domain), but after much consideration I came to the conclusion that giving visibility to a new name and new site would be a lot of work. As this new blog post from Forbes correctly indicates, reputation matters a lot when seeking business. That’s why I decided to stick with my surname, and in the coming days/weeks there will be a formal announcement regarding my third job, in which the work capacity cannot be guaranteed (it depends on clients). The focus is affordable scientific computing solutions that put the client in control. In essence, it is about spreading free/open source software and charging for the scarcity, which is skill and (wo)man hours. There is nothing unethical about it.

Together with some friends (I shall add people to the appropriate pages), a new logo, a CMS theme, and a soon-to-come redirection (a duplicate of index.htm will ensure all the older pages remain accessible), schestowitz.com will soon have a sort of relaunch. The site no longer attracts about 3,000 visitors per day like it used to (back in the days when it was regularly updated), but we shall see if it takes off as more than just a personal workspace with a lot of informal pages. I remain very much committed to all my jobs; starting something as my own ‘boss’ will just be something on the side.

Retrieval statistics: 21 queries taking a total of 0.308 seconds • Please report low bandwidth using the feedback form
Original styles created by Ian Main (all acknowledgements) • PHP scripts and styles later modified by Roy Schestowitz • Help yourself to a GPL'd copy
|— Proudly powered by W o r d P r e s s — based on a heavily-hacked version 1.2.1 (Mingus) installation —|