Roy Schestowitz wrote:
> Why Internet Explorer 7 Will Break the Web
>
> ,----[ Quote ]
[note from Microsoft]
Prior to deployment of IE 7, we strongly recommend that you test your
Web sites, extensions, and applications to make sure they are ready.
Microsoft recommends that Web sites and applications are ready for the
release of Internet Explorer 7 this month.
> | Now, I've heard this warning before, but it still gets me riled
> | up to read it. Upon seeing it yesterday, my first thought was,
> | "This is totally backwards! Isn't this exactly the scenario that
> | web standards were put in place to prevent?"
It may also be a direct violation of the provisions made by the court
in the antitrust ruling and settlement.
There are two possibilities. First, Microsoft may have realized that
they have lost ground in the browser market because of their inability
to support established standards such as industry standard CSS. The
new browser is supposed to support the industry standard versions of
CSS and JavaScript. Since there is a risk that the new browser will
break "IE-only" sites that relied on non-standard features, this may be
Microsoft's way of giving advance warning and making sure that sites
built around IE 6's non-standard features don't lose business now that
Microsoft is coming back in line with standards. There seem to be
several indications that this is the situation.
Of course, there is also the possibility that Microsoft has introduced
new twists and kinks and wants these sites to make the changes needed
so that only IE 7 browsers can access them. That would be a problem.
At this point, this does not appear to be the situation.
> | The governing
> | body of the web, the W3C (of which Microsoft is a member),
> | dictates the standards that browsers must adhere to. That way,
> | web developers can build their sites to comply with those
> | standards and guarantee that any browser visiting their site
> | will render the pages properly.
> `----
>
> http://blog.wired.com/monkeybites/2006/10/why_internet_ex.html
>
> It already proves to be a tremendous pain. This morning for example...
>
> http://comox.textdrive.com/pipermail/wp-hackers/2006-October/009160.html
>
> To quote a comment from a friend:
>
> "Microsoft must break some bad habits... but they are not.
> The article is more concerned with standards and how Microsoft does not follow them,
> leaving the burden of creating IE compatible sites up to the developers.
The biggest problem is that there have been repeated instances where
Microsoft's "extensions" have created far more problems than they
solved:
DHCP tends to get very difficult to audit, especially when combined
with programmable MAC addresses.

ActiveX downloads and executes programs without telling the user. The
security is supposed to come from a certificate that can be verified
with the certificate authority, but there is a history of people
fraudulently obtaining these certificates.

Dynamic DNS allows DNS servers to point to DHCP-assigned hosts, so that
users can move from one location to another. The problem is that there
have been many instances of sites being hijacked by spoofers claiming
to be the "legitimate" owner.

WebDAV allowed web sites to be updated very easily from Windows
workstations using tools like FrontPage, but there have been a number
of instances where commercial sites, including Microsoft's, found their
sites "updated" with content that was often counter to the owner's
intentions.

Use of SMB/CIFS made it very easy to share files on FAT file systems,
but the lack of proper user authentication and permission checking
often turned these shared repositories into virus-spreading "culture
dishes" that could infect thousands of computers in a few minutes.
Ironically, many of these problems were resolved by Linux and OSS, and
these solutions were then implemented by Microsoft in later offerings.
For example, the SMB sharing problem could be fixed by using a secured
file system such as those used on UNIX servers, and authenticating
users. This made it harder for unauthorized users to alter files
intended to be used by hundreds of users. Microsoft used NTFS, which
was widely deployed with the NT 4.0 workstation, to enforce strict
authentication and authorization checking - similar to the security
used in Linux (Samba) based servers. The scheme only worked when both
the workstation and the server were fully secured, but it eventually
solved this one little problem.
> FTA-
> Of course, Microsoft has a history of dictating their own standards, and they
> have the power to do so because their browser is used by somewhere
> around 80-85% of the world's web surfing audience."
This is what Microsoft thought, even up to a few years ago.
Unfortunately, they soon found that this attitude could backfire. Just
because 80-85% of the browsers in use were IE doesn't mean that
Microsoft's proprietary standards were being used. Many corporations
disabled features such as ActiveX completely. Many major corporations
would even block access to the certificate authorities used to validate
the certificates, and would "reject" the certificates. As a result,
much of the money behind that 85% demanded strict adherence to
published and verifiable standards. Some vendors even found that they
could lose substantial revenues by implementing a web site that would
ONLY work with IE features such as ActiveX and VBScript. Many
companies even sent notices to site managers when security warnings
were triggered, telling them to remove the demands for ActiveX control
execution or their entire DOMAIN would be blocked as a security risk.
When you want to sell to a company like K-Mart, WalMart, General
Motors, or Prudential Insurance, and they tell you that you'd better
adhere to public standards or no one in the company (including the
purchasing department) will be allowed to visit your site, you adhere
to the standards.
When a company like IBM, with 400,000 users interacting with external
clients, customers, and vendors, starts putting Firefox on the
corporate web site, and tells vendors to make their web sites FF
compatible or their contracts will not be renewed, it's a warning you
don't take lightly. How would you like to be the travel agency for IBM
and know that unless you made your web site FF friendly by the end of
the year, you would no longer be their travel agent?
Now consider that there are about 200 million FF users (20% of the
browsers on the market), and another 200 million who run with ActiveX
disabled and no ability to enable it. Then consider that many of those
400 million users are employees, customers, and customers of customers.
Do you really want to tell these people "go away, we don't want your
business" by refusing to display anything unless they are running IE
with ActiveX enabled? Do you really want to tell them "you can't
really trust us, so shop elsewhere" by giving them a message that says
they won't get all of the important information they need unless they
make their browser vulnerable to viruses and worms?
> http://digg.com/design/Why_Internet_Explorer_7_Will_Break_the_Web#c3430533
This is the big problem for Microsoft. They believed that they could
just thumb their noses at published industry standards. They believed
that they could add their own proprietary "enhancements" and simply
tell the market "we're better" without giving details. But as more and
more successful attacks on millions of computers, each costing billions
of dollars in labor, time, lost work, and prevention, are traced back
to these "enhancements", IT managers are reminded of why open standards
and open source were first used.
Go all the way back to 1984. AT&T has just divested its Baby Bells,
and is now selling UNIX. They are also getting license fees for code
included in BSD Unix. But these are large computers, often being used
by as many as 100 concurrent users. Many of these systems are running
on VAX 11/750s and VAX 11/780s. Applications are available in binary
form for these machines, but if there are problems and the UNIX
machine crashes, 100 users can be idled for 2-3 hours at a time.
Since you don't know when the computers will be fixed, you can't send
them home and let them make up the time later. Since all of the
information they need to do their work is on the computer, they can't
really do any real work. If you try to force them to work overtime to
make up for the lost productivity, the last "forced march" will be the
primary topic of discussion during the next crash, costing morale and
leading to attrition. Since these are highly skilled workers, and
highly productive workers, they are prime targets for recruiters, and
morale is critical.
UNIX is also available on several different types of hardware.
Applications for BSD systems are available in source code, along with
the operating system, for a mere $45,000. That seems like a lot of
money, until you realize that if server crashes idle your users even
once a month, and the average salary of those users is $40,000 per
year, you are losing roughly 2,400 hours per year, or nearly $48,000
per year. In practice, the early "binary only" systems often crashed
about once a week. At only $20/hour, that is easily $2,000 per week in
lost wages. And if these workers produce $10 for every dollar they are
paid, that's $20,000 per week in lost productivity.
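To put rough numbers on it (assuming, as above, 100 users idled for
about two hours per crash, at roughly $20/hour):

    1 crash/month x 100 users x 2 hours   =    200 hours/month
    200 hours/month x 12 months           =  2,400 hours/year
    2,400 hours/year x $20/hour           = $48,000/year in lost wages

With weekly crashes and a 10-to-1 productivity multiplier, the numbers
get much worse.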
USENET was developed by UNIX administrators FOR UNIX administrators.
It was similar in concept to the e-mail used on the ARPANET; the main
difference was that most corporations using UNIX didn't have the
high-speed dedicated connections of the ARPANET. They used the
UNIX-to-UNIX Copy protocol - UUCP - to send messages over dial-up lines
in short bursts. Calls were typically scheduled about once an hour
between hosts.
The UNIX administrators used USENET to send source code to each other.
Soon they started mailing lists, to make sure that everybody could get
source code changes within a day or so. The net result was a radical
improvement in the quality of the software. Later, "newsgroups"
provided shared message stores on each machine that could be read by
every user on that computer. Messages could be posted by topic. One of
the topics was net.sources, which was where the source code was sent.
Because all of the users could read the sources, and since most of the
users were programmers, it became common practice to review the source
code for bugs. When someone tried to send a trojan, it was caught.
The publisher was identified, his administrator was notified, and
remedies were immediately taken.
At almost exactly the same time, MS-DOS was starting to have its own
problems with trojans in binaries. Small companies were selling
software in binary-only form, written in assembler, compiled BASIC, or
compiled C. Malicious code was being passed in the boot sectors of
floppy disks. Many of these may have been copy protection schemes gone
berserk. It didn't matter. In the UNIX community, administrators
began demanding that all programs submitted in net.binaries also be
submitted in net.sources. This was about the time that Richard
Stallman and a team of people in net.legal started exploring the
possibility of using copyright law and copyright licenses to FORCE the
disclosure of source code. Specifically, they didn't want people
taking source code submitted to net.sources, then publishing
proprietary versions for popular platforms or MS-DOS - without the
permission of the original authors.
By 1987, Microsoft had nearly total control over the PC market. There
were contenders such as DR-DOS, and GEM, but Microsoft was doing
everything they could to make sure that MS-DOS remained the sole
operating system used on PCs.
Then, in 1988, the "Morris Worm" was released. Holes in network
services such as sendmail and fingerd allowed code to be pushed onto a
machine and executed, causing the sudden failure of about 5,000 UNIX
servers across the country. By this time, the average UNIX system was
being used by around 200 users. It made headline news, and made many
people very upset. The UNIX community began to realize that the best
way to prevent such a recurrence was to use ONLY published protocols
and open source software for communication between systems, and to
make sure that it was not possible to push code into a machine and
execute it without the administrator's permission.
In less than 3 years, the security policies and standards adopted by
UNIX administrators had made UNIX as secure as most mainframes. By
1990, the Morris Worm was a bitter memory, but it continued to drive
security requirements, even as MS-DOS machines got "Stoned" and
celebrated "Columbus Day" and other interesting days by executing
malware that was spread through floppies containing shareware but also
containing trojans.
Microsoft completely ignored the standards established by the UNIX
community. Gates even had contempt for them. After all, Microsoft had
its own UNIX, Xenix, which it licensed to OEMs such as Radio Shack in
the early 1980s, around the time MS-DOS was being developed for IBM.
Microsoft later sold most of its interest in Xenix to SCO, and by 1989
had divested the rest, retaining a 25% stake in SCO and assuming that
it could regain control of SCO with little effort if necessary.
Even on issues as simple as the termination of lines, Gates insisted
on using the carriage return plus line feed pair to directly drive the
printer, rather than using the line feed alone as a record terminator
and letting the software decide how that "line" should be handled by
the printer.
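To make the difference concrete, here is a minimal C filter - my own
sketch, not anything either company shipped - that treats the line feed
alone as the record terminator and strips the carriage returns that
MS-DOS style text puts in front of it:

    #include <stdio.h>

    /* Read MS-DOS style text (lines ended by "\r\n") from stdin and
     * write UNIX style text (lines ended by '\n') to stdout.  The line
     * feed alone marks the end of a record; any carriage control a
     * particular printer needs can be added later by whatever program
     * drives that printer. */
    int main(void)
    {
        int c, pending_cr = 0;

        while ((c = getchar()) != EOF) {
            if (pending_cr && c != '\n')
                putchar('\r');      /* keep a CR not followed by LF */
            pending_cr = (c == '\r');
            if (!pending_cr)
                putchar(c);
        }
        if (pending_cr)
            putchar('\r');          /* input ended with a lone CR */
        return 0;
    }

Compiled and used as an ordinary filter in a pipeline, it leaves the
decision about carriage control to the program at the end of the pipe.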
This was the fundamental design difference between UNIX and MS-DOS.
MS-DOS was designed to do as little as possible. It would launch a
single application. MS-DOS 1.x didn't even have directories. Features
from UNIX were added, one at a time, and only when absolutely
essential, and only in the simplest possible form. And when these
features were implemented, they were implemented with little "twists"
which made them incompatible with UNIX.
UNIX went the other way. The kernel and libraries provided 90% of the
code required to implement applications, and simple components could be
compiled into self-contained units which could be plugged into each
other via "pipes" and "streams". The data stream was a revolutionary
approach, because it meant that an application didn't have to know
WHERE data was coming from, or where it was going. Furthermore,
because the pipelines used very little memory, any amount of data could
be processed through the stream. Applications were written to parse
the incoming stream as the bytes were received, and to take the
appropriate actions as the information they needed arrived. Again,
this substantially reduced the amount of information that had to be
kept in physical memory.
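As a rough illustration (my own toy example, not any particular
historical program), a classic UNIX filter in C neither knows nor cares
where its bytes come from or go to; it just parses stdin one character
at a time and writes the result to stdout in constant memory:

    #include <ctype.h>
    #include <stdio.h>

    /* A trivial filter: copy stdin to stdout, folding letters to upper
     * case.  It processes the stream one byte at a time, so it can
     * handle any amount of data in a small, fixed amount of memory,
     * and it never needs to know whether the bytes came from a file,
     * a terminal, or another program. */
    int main(void)
    {
        int c;

        while ((c = getchar()) != EOF)
            putchar(toupper(c));
        return 0;
    }

Hooked together with pipes (the file and program names here are just
placeholders), the same binary works unchanged in any position:

    sort < names.txt | ./upcase | lpr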
Applications didn't decide how to format output for a printer; the
information was formatted into metafiles, which could then be sent in a
device-independent format to printer "filters" that could format output
for simple text displays, complex typesetters, or sophisticated WYSIWYG
graphical displays.
The irony is that the X10 and X11 projects ended up making a windowing
interface similar to those used on the Xerox Alto, the Apple Lisa, and
the Sun-1 workstations available for UNIX, in the same
device-independent manner that had been used for terminals, printers,
and other display devices.
By establishing and conforming to industry standard protocols, and
using source code contributed by those wishing to sell hardware and to
have the standards adopted, and having specifications so complete that
college students could implement them, many of the complexities of
nondisclosures, patents, and other legal obstructions to progress were
avoided. It was this support of public standards, implemented in
published "Open Source" code, that made the commercial Internet
possible, as well as the World Wide Web. Even the Web Browser,
including the core engine used by Microsoft, was based on code
published in source code form under an "Open Source" license.
When Linux was developed, the kernel was pretty simple. But it
provided a foundation upon which other Open Source software, originally
developed for UNIX, could be built. In less than 18 months, nearly all
of the OSS software available for UNIX had been ported to Linux. This
included BSD libraries, X11 servers, displays, toolkits, and
applications, and even applications published under open source
licenses by Sun Microsystems. Because Linux could be installed on the
same machines that were then being used to run Windows 3.1, it became
possible to have a full-blown UNIX-like system that ran most of the
applications that had been written for UNIX, for less than $1,000,
about the same price as a Windows 3.1 machine.
Today, Microsoft knows better than anyone how pervasive OSS software,
and Linux, actually are. They also know how pervasive UNIX, in the
form of the Mac and FreeBSD, is. They are watching the Mac commercials
and probably not liking them any more than the OEMs do. Here is the
Mac, offering the standards, reliability, security, stability, and
flexibility of UNIX, while still offering the ability to run Windows
programs. Meanwhile, Dell, HP, Lenovo, Sony, Toshiba, and Gateway are
making machines that are ready, willing, and able to do the same thing,
but they can only ship these machines with Windows XP. Microsoft has
GOT to be feeling the pressure from these guys.
Microsoft can't afford to get a rash of broken sites, virus
infestations, and other disasters after the release of IE 7. Such
activity could spell disaster for Vista.