
Archive for the ‘Security’ Category

Detexian Reviewed

I am an early adopter of Detexian, a service which I increasingly rely on for security. My wife and I run a small media entity, comprising two sites, which attracts about 5 million hits a week. One of the sites is modest and non-confrontational, whereas the other is more controversial because it is critical of activities such as bribery, illegal surveillance, and all sorts of corruption. There are certainly people and organisations willing to spy on and undermine the site; among those criticised are large technology companies and the institutions they work with.

Being a small team, we cannot keep up with the logs, let alone properly analyse them for security threats; it is simply infeasible. For log analysis we also require a service that is isolated from surveillance-intensive hosts such as Amazon. Moreover, we operate on a very small budget, as the sites are public services rather than for-profit ventures.

We now rely on Detexian to inspect the traffic and generate concise reports. Detexian helps avert disaster by alerting us to troubling activity patterns before flaws are found and exploited. These are not young sites; they have been around for nearly a decade and a half, and over the years we have suffered more DDoS attacks than we can remember, as well as intrusion attempts (none successful). Some attacks managed to cause damage, but it was always repairable. Recently, Detexian alerted us to SQL injection attempts and made recommendations.

We shall continue to rely on Detexian for the foreseeable future and are happy to pay for the service, knowing that someone “has got our back” and is providing informed advice on how to guard the sites.

Cyber Security a Matter of Life and Death Sometimes

CIA interference


When oil rigs/platforms sink (recall this incident), a lot of people die. When gas pipes explode, a lot of people can die as well. In the case of BP, Microsoft Windows was at least partly to blame for the incident (I wrote about this many times at Techrights), and the above, just (re)published by Wikileaks, makes one wonder where the US derives its moral high ground from. This shows the importance of using software one can truly control and always trust, such as Free/Libre software.

OpenSUSE Web Site Cracked

SUSE has not yet said anything about it publicly (to acknowledge this); it seems to have restored from backup or simply removed the defacement.

OpenSUSE Cracked

How to Patch Drupal Sites

My experience patching Drupal sites goes back years, and my general ‘policy’ (habit) is to not upgrade unless or until there is a severe security issue. It’s the same with WordPress, which I’ve been patching on several sites for over a decade. Issues like role escalation are not serious if you trust fellow users (authors) or if you are the sole user. In the case of some agencies that use Drupal, it might be safe to say that the risk introduced by changing code outweighs the security benefit, because as far as one can tell, visitors of such sites do not even register for a username. All users are generally quite trusted and work closely together (though one would have to check the complete user list to be absolutely sure). There is also a ‘paper trail’ of who does what, so if someone were to exploit a bug internally, e.g. to do something he or she is not authorised to do, it would be recorded, which in itself acts as a deterrent.

If the security issue is trivial to fix with a trivial patch, then I typically apply it manually. When the SQL injection bug surfaced some months back, that’s what many people did for the most part. The same applies to larger releases (not bug fixes), until there is no alternative. What one needs to worry about more are module updates, especially security updates. One should make a list of all modules used and keep track of news of new releases (general FOSS news usually covers such flaws only once it’s too late). Thankfully, detailed information on what the flaws are becomes available, along with the associated risks, both for core and for additional/peripheral modules.
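For the manual route, here is a minimal sketch of the routine, assuming a POSIX shell with the standard patch(1) tool; the function name, file names, and patch name are all illustrative:

```shell
#!/bin/sh
# Hedged sketch: apply a one-off security patch to a single site file.
# Keeps a rollback copy, dry-runs the patch, and applies only if clean.
apply_patch_safely() {
    target=$1      # the file the patch modifies
    patchfile=$2   # the security fix, as a unified diff
    cp "$target" "$target.bak" || return 1            # rollback copy first
    if patch --dry-run "$target" < "$patchfile" >/dev/null 2>&1; then
        patch "$target" < "$patchfile"                # apply for real
    else
        echo "patch does not apply cleanly; aborting" >&2
        return 1
    fi
}
```

If the live file has diverged from upstream (a common reason manual patching fails), the dry run catches it before anything is modified, and the `.bak` copy makes reverting a one-line `cp`.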

Then there’s testing, which one needs to do for any changes made, assuming time permits. The last major Drupal flaw had a 7-hour window between publication and exploitation in vast numbers (maybe millions of sites). It means one cannot always follow the formal procedure of testing, though testing in an ad hoc way, or minimising the risk by applying a patch, ought to work well. This leads me to suggest that developers need not one uniform workflow/process for changing Drupal but a multi-faceted one. Proposal:

If the flaw is

1. severe
2. not back-end (i.e. not related to role management)

consider the complexity of the patch and test immediately on an existing copy of the site, then deploy on ‘live’.

If the patch is a core patch, no alternatives exist. If the patch applies to a module, study the effect of disabling that module (assuming nothing depends on it) and consider temporarily taking it out of reach on the public site(s).

For less severe flaws:

1) merge into git on a dedicated branch
2) test on a local vagrant installation
3) schedule for deployment to “development” for testing
4) schedule for deployment to “staging”
5) run regressions (one needs to define these)
6) Client to do any required acceptance testing
7) schedule for deployment to production.
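Steps 1 and 5 can be sketched as follows, assuming the site lives in a git checkout; here a throwaway repository stands in for the real site, and the branch name, file, and “regression” check are all illustrative:

```shell
#!/bin/sh
# Hedged sketch: dedicated branch for a security fix (step 1) plus a
# trivial stand-in for a regression check (step 5). All names illustrative.
set -e
cd "$(mktemp -d)"
git init -q . && git config user.email dev@example.com && git config user.name dev
echo '<?php // site code' > index.php
git add . && git commit -qm 'baseline'

git checkout -qb security-fix                  # step 1: dedicated branch
echo '<?php // site code (patched)' > index.php
git commit -qam 'apply security patch'         # ready for steps 2-4 (vagrant,
                                               # development, staging)
grep -q 'patched' index.php                    # step 5, minimally
echo 'regression check passed'
```

From here, steps 3 to 7 amount to pushing the branch to whichever remotes drive the development, staging, and production environments.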

Suffice it to say, changes should be made only through git (never directly on the server), and a database dump (or snapshot) should be taken as well, both for quick fixes and for longer testing, because even if changes are reverted (git rollback), the database can be left in an inconsistent state.
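Pairing the code snapshot with a data snapshot can be sketched like this; the `mysqldump` line is the real operation but is commented out (and stood in for by a plain file write) so the sketch runs without a database, and the database name is illustrative:

```shell
#!/bin/sh
# Hedged sketch: take a timestamped database dump before touching code, so
# a git rollback can be paired with a matching data rollback.
set -e
cd "$(mktemp -d)"
stamp=$(date +%Y%m%d-%H%M%S)
# mysqldump --single-transaction drupal > "db-$stamp.sql"   # on a real site
echo '-- stand-in dump' > "db-$stamp.sql"                   # offline stand-in
ls db-*.sql
```

Keeping the timestamp in the file name makes it trivial to match a dump to the git commit taken at the same moment.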

Regressions of interest for Drupal are not just site-specific. There are some nice templates for these, and one needs to consider which modules are used on the site. Intuition and general familiarity with the CMS loop/hooks help one predict what impact a change would have on modules, if any. Drupal has good documentation of functions (by name), so these too can be studied before changes are made. To avoid some modules ‘silently’ breaking following any change to core (or even to modules), one may need to go through a list of tests, specified in advance, that help verify no module spits out PHP errors or behaves oddly.

It is common to test critical pages first, e.g. finding an authority, research reports, etc. Sometimes it should be possible to automate the testing by making local snapshots of pages of interest and then diff‘ing them after changes are made, using sophisticated tools like Versionista or a manual side-by-side comparison by a human operator. There are browser extensions that further facilitate this, but caching layers such as CloudFlare, Varnish, etc. can impede the process (even though changes to underlying code may invoke an override, at least for Varnish).
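The snapshot-and-diff idea can be sketched as follows; the page list is illustrative, and the fetch is stubbed with a local function so the sketch runs offline (on a real site it would be `curl -s "$base/$page"` or similar):

```shell
#!/bin/sh
# Hedged sketch: snapshot critical pages before a change, re-snapshot after,
# and diff the two sets. Page names are illustrative.
set -e
cd "$(mktemp -d)"
fetch() { echo "<html>rendered output for $1</html>"; }   # stand-in for curl

pages="front find-an-authority research-reports"
mkdir before after
for p in $pages; do fetch "$p" > "before/$p.html"; done
# ... apply the core/module patch here ...
for p in $pages; do fetch "$p" > "after/$p.html"; done

if diff -r before after >/dev/null; then
    echo 'no visible regressions on critical pages'
else
    diff -r before after | head                # show what changed
fi
```

Any caching in front of the site (CloudFlare, Varnish) should be bypassed or purged before each snapshot, otherwise the diff compares stale copies rather than the patched output.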

Regressions are nice, but in many cases developers don’t have time to run them and a simpler set of manual checks can help gain confidence that changes made have no detrimental effects.

I cannot recall ever having major issues patching (as opposed to upgrading) the core of WordPress and Drupal, and I have done this hundreds of times. The quality of testing when it comes to core (not external/additional modules) is quite high, but another worthwhile step, before making any changes, is to look around forums to see what experience other people have had. There were cases where patches were problematic and this quickly became public knowledge; sometimes workarounds, or patches for the patches, are circulated within hours.


CCTV Not Effective

Surveillance camera

WITHOUT a doubt, there are circumstances where evidence extracted from CCTV is valuable. For instance, if there is a street or pub brawl, one can use footage to verify or falsify eyewitness accounts or the story told by those involved.

For the most part, however, CCTV fails to justify its great cost, not just monetary cost but also the cost to our civil liberties. Today I got a good reminder of that.

Having spent nearly an hour speaking to security personnel and the local police, I found that CCTV did, in fact, capture the theft of my hybrid bike (which retails at around £500) roughly two hours earlier. It was captured because I only ever park and chain my bike to solid objects, like designated bike rails, in front of cameras and in the presence of many people.

Not only did several cameras capture good footage of my bike being stolen, but the manager of the store (which I was in for just 10 minutes) was in the parking lot witnessing the crime. Was that enough to prevent the crime? No. To capture the perpetrator? No. To return the stolen bike? No.

The perpetrator wore a hoodie, so it is hard to identify him (the footage only identifies him as a black man in his mid-twenties, to quote the security personnel who investigated it). It is too early to assume that the bike won’t be returned and the perpetrator caught, but the fact of the matter is that CCTV, as I have long argued, does not help prevention and rarely helps identification.

If the perpetrator is very naive, and is therefore caught at the scene early on, it might work, but the hard cases cannot be resolved by CCTV. All that can be achieved is confirmation that a certain crime occurred, and in cases where an insurance agency is involved, it may help prevent insurance/benefit fraud. My bike was not insured; I don’t know anyone who buys bike insurance.

Surveillance tools which are run and owned by the state (or law-enforcement agencies), such as CCTV, are not there to protect, and arguably they do not serve as a deterrent either. They are probably not worth the investment. More people are needed on the ground, creating more jobs and adding to real security, not sci-fi, pseudo-futuristic security theatre.

European Union/Commission Saves Us From Big Brother


According to the news today (a theme one comes across by listening to any radio station in the UK), continental Europe comes to the rescue again. Here in Manchester, getting a cancer-causing scan is mandatory for boarding any plane at Manchester Airport. This is very profitable for some companies and their cronies, who devised these ludicrous measures because of one guy with explosives in his underwear (an old incident whose casualty count was zero). As I have been stressing for almost a year, those machines that scan people as though they were suitcases are assured to kill (in the long run) more people than they would save by preventing explosives from getting onto planes via one’s breast area, crotch, etc. The whole thing is a sham and a cancer-generating pipeline that makes some industrialists rich.

So anyway, the news here is that removal of all such machines has just been demanded by the authorities in the EU (probably Belgium and the surrounding aristocracy). This is good. No more will I need to confront airport staff over their stubbornness; why should they impose X-ray scans as a sort of blackmail prior to travel? What has our civilisation sunk to? And that’s not even delving into other issues, such as the acquisition (with alleged retention) of naked pictures of every citizen who travels through an increasing number of airports. Civil liberties, not just our health, are being jeopardised without simple risk calculations being taken into account. Several months ago I did some maths related to this and came to the conclusion that unless those scanners can prevent some 200 large planes from going down, by detecting passengers with explosives that cannot be detected by other means, the deaths due to cancer will outnumber the lives saved. In other words, by placing those machines (lethal X-ray, rebranded) in the airport, they sign the death sentence of many people and hardly save any lives.

On numerous occasions I have had discussions about this with staff who work around those machines, and never could they provide a compelling explanation for why they participate in it (Big Brother cooperation). Perhaps “I’ve got a mortgage to pay” is the best they can do. One persuasive method is to clarify to these people that their health, too, is at great risk, and that information about it is withheld. Hopefully those machines will all get canned, just like the ID cards. Liberty and security don’t sit well together.

Improved Surfing and Browsing Notes

OVER time, I’ve accumulated a few notes about things that I bear in mind while surfing the Web or communicating with people. Here’s a quick breakdown.

Saving Evidence

Corporations, unlike us mere mortals, don’t care about preservation any more than the law requires. That’s why the Bush Administration, Intel, Microsoft and many other organisations purge E-mails and shred documents without any guilt or hesitation. I should really make copies of everything I cite (I rely too much on the Web Archive). A friend of mine wanted to automate this and create archives of all my posts, plus local copies of cited articles. With wget, these can be sorted nicely by URL, but I never do this. If you have any tips, please leave a comment.
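For what it’s worth, the wget approach can be sketched as follows; the URLs are made up, the actual fetch is commented out (and stood in for by placeholder files) so the sketch runs offline, and the wget flags shown are the ones that produce a per-host directory tree:

```shell
#!/bin/sh
# Hedged sketch: archive cited articles into a tree sorted by URL.
set -e
cd "$(mktemp -d)"
cat > cited-urls.txt <<'EOF'
http://example.com/2010/01/some-article
http://example.org/reports/security.html
EOF
# wget --input-file=cited-urls.txt --force-directories \
#      --page-requisites --convert-links --wait=1       # real, polite fetch
while read -r url; do                                   # offline stand-in
    path=${url#*://}               # host/path layout, as wget -x would make
    mkdir -p "$(dirname "$path")"
    echo "placeholder for $url" > "$path"
done < cited-urls.txt
find . -type f | sort              # one file per cited URL, sorted by host
```

Run from cron against a list of URLs extracted from new posts, this would keep a local mirror of everything cited, independent of the Web Archive.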

WWW Privacy

A person can always use a proxy if there is fear of having IP addresses harvested. I set up this thing for myself a couple of years ago and I had to make it password protected because lots of people from Asia used up my bandwidth. Then, the only server that ever sees my IP address is mine. You’re essentially passing requests through a trusted middleman.

Since all routers are peers as well, there’s no way to get perfect independence from other people’s vigilant eyes. One option is therefore to use Web mail via proxies, but then you’re still relying on the proxy not giving away its log files (or destroying them every night).

I suppose you may have heard by now about the police demanding encryption keys from an animal rights activist over here in Britain. There’s increased mail monitoring as well, which is why I suggest that people get themselves covered. There’s also that Blogger (Google) incident from last month: Google gave away the IP address to the police, citing some excuse. Think of Mini-Microsoft (the blogger) and many others who rely on anonymity; without it, they would lose their jobs. It seems like the approaching end of an era. Authorities demand greater control, misuse their powers, and get exposed too often.

E-mail Privacy

Have you, readers, considered encrypting your E-mails? I do this with people who can. If you’re using Thunderbird, there’s an extension called Enigmail that makes it very simple to set up. It’s cross-platform, too.

Remember that privacy is among your rights. Don’t let people take it away from you and essentially treat you like a criminal. With less privacy, you are left powerless and exposed.

Original styles created by Ian Main (all acknowledgements) • PHP scripts and styles later modified by Roy Schestowitz • Help yourself to a GPL'd copy
|— Proudly powered by W o r d P r e s s — based on a heavily-hacked version 1.2.1 (Mingus) installation —|