Archive for the ‘Security’ Category
Sadly, many people use a convenient argument to defend Windows’ security problems. They would like you to believe that security is failing because of market share, not because of inherent flaws in design, which proper engineering could have avoided. Windows was built to serve users’ convenience while neglecting to account for the subsequent inclusion of an Internet connection. Windows was very desktop-centric, as Gates’ snubbing of the Internet has proven over the years. That, and only that, is why Microsoft struggles to rewrite a vast codebase quickly and securely into mature, well-established libraries.
The following articles demonstrate and explain why Windows is simply insecure by design. Market share plays a relatively minor role in this equation.
- Linux Security: A Big Edge Over Windows
- Security Report: Windows vs Linux
- Microsoft Windows: Insecure by Design
- If Only We Knew Then What We Know Now About Windows XP
- Why Windows is a security nightmare
Consider more secure platforms, preferably ones that conform to the POSIX/UNIX model, which has matured over many decades. Keep the cr4ck3rZ working much harder.
There has been a great deal of talk about browser statistics recently. Market share has become a measure of diversity, which ensures that Web developers tailor their sites to standards rather than to one particular application. Security remains at the heart of this debate, but it is clearly a highly complex problem.
All Web browsers are insecure to some degree, because they all must work with flawed code in the operating systems. There are some indications of progress, such as frequent patches from Microsoft and Mozilla to close security holes. Still, these actions may be too little too late if a zero-day exploit is the attack weapon.
It all comes down to patching speed, the number of flaws, and their severity (e.g. privilege escalation can be catastrophic).
Related article from the same day (and same Web site):
I shall continue to argue that Microsoft software (and Windows primarily) is slow and too cumbersome to work with. It discourages high productivity. Might this explain why that company from Redmond has produced so little in the past 5 years? Let us discuss.
I am shocked to see a software behemoth with so many employees still struggling to ship products on time. I can recall that rusty O/S called Windows XP, which was released when I was a teenager. It is amazing that Microsoft has achieved so little in the past half-decade. All it has been able to deliver is just another ‘Service Pack’, due some time next year. This one has a different name and a new theme, Aero Glass (see above). It also bears a hefty price tag.
Linux users may like to handle complexity, but meanwhile it seems as though their codebase is far more maintainable than that of Microsoft. In case you have not followed some key events, 60% of the codebase of Windows must be rewritten as it’s an utter pain to extend.
Let us take a step further and discuss the issues of security, diversity, and competition. Windows was not built with security or multiple (networked) users in mind, so it is merely ‘patched’ to bridge that crucial gap. A one-man election might work with a Windows-based Diebold machine. Windows is, after all, a single-user O/S with some ‘hacks’ that make it usable by multiple users over a network (e.g. the Internet). And that is worrisome. It has led to cyberstorms and makes the Internet a less pleasant place than ever before. Patches take long to issue because, in a codebase full of ‘hacks’, there are just too many dependencies to consider. There is poor modularity. This monolithic approach leads to flakiness and unpredictable behaviour.
Is diversity the answer? Is a staged migration to more mature and reliable platforms the path to secure computing? I have little or no doubt about it. But this will not be easy. It is only natural to assert that Microsoft is doing illegal things to stifle its competition. Such software-industry vermin deliberately restricts ‘diversification’. There is no parity in an industry where the behemoth outmuscles any competitive threat before it matures enough to match it. Microsoft strives towards a state of mono/oligopoly, and the law offers no barriers, as it is being tweaked by lobbyists.
Competition, you argue? I see none, but luckily people are beginning to see this and respond accordingly. It is a false sense of competition when a startup needs and depends on vendor X in order to develop a product that competes with vendor X. That is what Microsoft does through operating systems, distribution channels, licensing, and programming languages. Its stranglehold on the market may soon be broken, at least in Europe. The rest of the world is quietly migrating to Linux, although the scale of this is not being trumpeted through advertisements. There is no marketing department in a Free software initiative.
A flood of bad news (for Microsoft) has rolled its way onto the headlines. It all happened yesterday, as well as earlier on today. I believe some quotes will speak better than their detailed interpretation.
The pair of worms surfaced over the weekend, several security companies said in alerts. The malicious software tries to hijack the computer for use in a network of commandeered PCs that can be remotely controlled, popularly called a botnet.
It makes one wonder how games are affected. The Xbox series shares the same DNA as Windows.
Using malware or software designed to infiltrate a computer system, hackers steal account information for users of MMO games and then sell off virtual gold, weapons and other items for real money.
Windows Mobile likewise.
Since developers are not in a hurry to keep their users information secure… we feel compelled to publish – with exclusivity granted to us by author till August 21, 2006 – an article, that reveals various problems with Windows Mobile software from various software vendors! This article is a “must read” for any serious user of Windows Mobile…
Lastly, a security expert implicitly explains why Windows needs to be rebuilt. Jim Allchin, the main architect of Windows, has already said that 60% of the source code needs to be rewritten! It is no wonder that there was a “development collapse” in September 2005, according to Steve Ballmer. Windows Vista is the product of just 6 months in development (plus testing).
Failing to acknowledge or fix an infrastructure plagued with problems raises many doubts about any security product’s ability to function on such a foundation. Placing more complexity on top of existing (and flawed) complexity does not lead to increased protection, but rather fosters a false sense of it.
That is a lot of trouble to digest in just one day. The implications are SPAM and DDOS attacks, the vast majority of which is spewed from hijacked Windows machines (‘zombie armies’ or ‘botnets’). Sadly, I am among those who are affected by both detriments.
- For malware developers
- For spammers
- For extortionate botmasters
- For spam filter developers
- For firewall developers
- For anti-virus developers
All of the above are nasties or software that defends against them. All of them exist and prosper owing to the fact that Windows was never built with security in mind. I can’t help feeling bitter, as I am among the sufferers, despite the fact that I touch no Microsoft software. In a matter of just one week, a 30-megabyte mail account got clogged up by SPAM. The sheer volume that comes in means I cannot afford to even look at all the subject lines; rather, I go by patterns and highlighting-type filters. It is unbearable, as I am skipping some genuine mail.
Windows botnets have brought the Internet to a dark age. Some people question themselves as to whether abandoning the use of E-mail altogether is the better way. And as for collaboration-based, Web 2.0-ish software, I have already been forced to disable much of its functionality (e.g. registrations, comments, and open Wikis). I also needed to block 2 IP addresses yesterday, due to heavy and continuous abusive spidering of my main site. At least the abusers’ ISPs were alert and they quickly took action. These attacks came to an end yesterday. They were not the first, though. It is a recurring pattern.
Several years ago I said that SPAM was a problem that did not affect me and I would rather just ignore it. But I am afraid that this is no longer possible. And unless Microsoft protects its O/S (Vista has already been proven to be hijackable) or loses a very significant share of the market, things will not improve any time soon. They will only get much, much worse.
TODAY I would like to explain, at a relatively shallow level of depth, my most basic backup routines. I will concentrate on a somewhat simplified perspective, namely my current backup approach for local files, as opposed to the Web. The method is largely automated, owing to cron jobs (scheduler-driven). More details and methods were described in older blog items. For example, have a cursory look at:
- One-Click Backups
- Powerful Backup Scheme
- Basics of Data Recovery (XHTML presentation)
- Mail Backup and Restoration
- Backup Fallacy
- RAID Redundancy
At present, I continue to seek and stick to a robust backup mechanism that is fairly immune to human error, as well as hardware failures. I take the ‘stacked backup’ approach (keeping several cumulative/progressive backups) and I always remain paranoid, so as to be on the ‘safe side’ of things. I fear (and maybe even loathe) situations where I might lose data, as this costs a lot of time and can even lead to considerable emotional pain, especially in the case of irreversible loss. As a result, I have scripted all my backup routines. I can just set it all up and thereafter forget about it, so the frequency of backups can be increased without extra cost (time). I would like to share a few commands that I use in this blog post, for whatever it’s worth. Here are bits referenced from the crontab file, as well as some corresponding and related scripts.
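The crontab itself is not shown, so here is a hedged sketch of what entries driving commands like the ones below might look like. The schedule times are my assumptions, not the author’s actual ones; note that `%` has a special meaning inside crontab and must be escaped as `\%`.

```shell
# Hypothetical crontab entries (times are illustrative; paths follow the post).
# Nightly dotfile snapshot at 02:00 -- note the escaped % signs in date(1):
0 2 * * *   tar -cf /media/SEA_DISK/Home/Home-Settings/home-settings`date +\%Y-\%m-\%d`.tar ~/.[0-z]*
# Twice-weekly wipe of the mirror before a fresh copy (e.g. Mon and Thu, 03:00):
0 3 * * 1,4 rm -rf /home/roy/Main/BU/*
```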
First of all, here is a command which takes all the settings files (those beginning with a dot) and puts them on the external storage media, datestamped. It is possible to go further and compress the archive (e.g. using gzip), but that makes the entire process much slower.
tar -cf /media/SEA_DISK/Home/Home-Settings/home-settings`date +%Y-%m-%d`.tar ~/.[0-z]*
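For completeness, here is what the compressed variant would look like. It is demonstrated on a throwaway directory so the sketch is runnable anywhere; the `HOME_DEMO` and `DEST` names are stand-ins of mine, and the real `~` and SEA_DISK paths from above should be substituted back.

```shell
#!/bin/sh
# Compressed variant of the dotfile snapshot: -z pipes the tar stream
# through gzip, trading speed for space (the slowdown noted above).
# Throwaway directories stand in for ~ and the external disk.
HOME_DEMO=$(mktemp -d)
DEST=$(mktemp -d)
echo "set nocompatible" > "$HOME_DEMO/.vimrc"   # a sample dotfile

# Same shape as the uncompressed command, plus -z and a .gz suffix.
tar -czf "$DEST/home-settings-`date +%Y-%m-%d`.tar.gz" -C "$HOME_DEMO" .

# List the archive contents to confirm the dotfile made it in.
tar -tzf "$DEST"/home-settings-*.tar.gz
```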
Here is a simple way of preparing a datestamp-named directory.
mkdir /media/SEA_DISK/Home/`date +%Y-%m-%d`
I then take all files to be backed up, slicing them into volumes of 1 gigabyte (the filesystem will not accept files that exceed 4 gigabytes in size).
tar -cf - /home/roy/Main/BU|split -b 1000m - /media/SEA_DISK/Home/`date +%Y-%m-%d`/Baine-`date +%Y-%m-%d`.tar
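Restoring from such split volumes is worth spelling out: the pieces must be concatenated in name order before tar sees the stream again. Here is a runnable sketch on a throwaway directory (the demo names are mine; substitute the SEA_DISK paths in practice, and much larger volumes than the 1k used here for demonstration).

```shell
#!/bin/sh
# Recover files from a tar stream that was split into volumes.
# Throwaway directories keep the demo self-contained.
SRC=$(mktemp -d)    # stands in for /home/roy/Main/BU
VOL=$(mktemp -d)    # stands in for the dated SEA_DISK directory
DEST=$(mktemp -d)   # where the restore lands
echo "hello" > "$SRC/file.txt"

# Create and split exactly as in the backup step (tiny volumes for the demo).
tar -cf - -C "$SRC" . | split -b 1k - "$VOL/demo.tar."

# Restore: cat the volumes back together in name order, untar the stream.
cat "$VOL"/demo.tar.* | tar -xf - -C "$DEST"
cat "$DEST/file.txt"   # → hello
```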
Lastly, important files that change frequently are copied without any compression.
cp -r /home/roy/Desktop/ /home/roy/.kde/share/apps/kpilot /home/roy/Main/MyMemos \
   /home/roy/Main/kpilot-syslog.html /media/SEA_DISK/Home/Misc_local # local
I prefer to send copies of these files off-site as well, just for the sake of redundancy.
konsole -e rsync -r /home/roy/Desktop /home/roy/.kde/share/apps/kpilot \
   /home/roy/public_html firstname.lastname@example.org:/windows/BU/Sites/SCG # and remote
In the above, Konsole is just a convenient graphical-textual wrapper for these operations; it prints status and flags errors, should they ever emerge (a rarity).
I use tape archives to retain nightly stacks. Every night I use rsync to replicate my main hard-drive. To avoid accumulating deprecated files, I create a fresh copy twice a week, using rm -rf followed by scp (it could be rsync as well, in principle); a storage unit with a total capacity of 0.3 terabytes keeps stacks of the files before each rm -rf operation. Here are some bits of code which are hopefully self-explanatory.
konsole -e rsync -r email@example.com:/home/roy/* /home/roy/Main/BU/ &
For a fresh copy of a remote home directory, begin by erasing the existing files.
rm -rf /home/roy/Main/BU/*
rm -rf /home/roy/Main/BU/.[0-z]*
Then, copy all files using a simple remote copy command.
konsole -e scp -r firstname.lastname@example.org:/home/roy/* /home/roy/Main/BU/ &
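The wipe-then-copy cycle can be gathered into one self-contained sketch. Here `cp` stands in for `scp`/`rsync` purely so the demonstration runs locally without a remote host; the directory names are throwaway stand-ins, not the author’s.

```shell
#!/bin/sh
# The twice-weekly refresh: wipe the stale mirror, then pull a fresh copy.
SRC=$(mktemp -d)      # stands in for the remote /home/roy
MIRROR=$(mktemp -d)   # stands in for /home/roy/Main/BU
echo "current" > "$SRC/notes.txt"
touch "$MIRROR/stale-file"   # pretend the mirror holds a deprecated file

# Wipe everything in the mirror, dotfiles included (as in the two rm lines).
rm -rf "$MIRROR"/* "$MIRROR"/.[0-z]*

# Pull a fresh copy; in the original this is scp -r from the remote host.
cp -r "$SRC"/. "$MIRROR"/
ls "$MIRROR"   # → notes.txt (stale-file is gone)
```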
The stacked backups that are dated get deleted manually, and quite selectively so! One should permit reversal to older states of the filestore by leaving sensible time gaps between retained backups. This prevents backups from being ‘contaminated’ too quickly. Important files are often replicated on file-/Webspaces, so the most I can lose is often less than one day’s worth, in the event of physical hard-drive failures. The files are kept in 3 separate archives at 2 different sites in Manchester (home and the University; it used to be three sites before I left one of my jobs). All in all, I hope this inspired someone. If not, at least it can serve as a page I can refer friends to in case they seek something similar.
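The pruning above is deliberately manual, but the candidates could at least be surfaced automatically. A small sketch, under my own assumptions (a 90-day cut-off and the dated directory naming used above), demonstrated on a throwaway tree; substitute /media/SEA_DISK/Home in practice.

```shell
#!/bin/sh
# Surface dated backup directories older than a cut-off for manual review.
# Throwaway tree stands in for /media/SEA_DISK/Home.
BASE=$(mktemp -d)
mkdir "$BASE/2006-05-01" "$BASE/2006-08-19"
# Age the first directory artificially so the filter has a match.
touch -t 200605010000 "$BASE/2006-05-01"

# Directories matching the date pattern with mtime older than 90 days
# are pruning candidates; nothing is deleted here, only listed.
find "$BASE" -maxdepth 1 -type d -name '20??-??-??' -mtime +90
# → prints only the 2006-05-01 path
```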
More tips on *nix-oriented backup can be found in a recent article.