Roy Schestowitz wrote:
> A cathedral of formats or a castle of cards?
Let me preface this whole posting by reminding the reader of the
fundamental nature, and purpose, of a standard.
A standard is a formal or informal agreement between multiple vendors,
customers, end-users, and supporters. This agreement generally
requires the comprehensive review and approval of all members and
affected bodies. The review process helps to identify the failings,
drawbacks, and vulnerabilities of a proposed standard. In most cases,
these issues can be resolved by the review process. Most standards
bodies also want to have a "reference model" in public domain or GPL or
OSS software, that can be used by implementors to validate conformance
to the established standard. In fact, the test of a standard's
completeness is that you could hand the documentation for the standard,
or a portion of it, to a classroom of 30 students and, in a few days,
or a few weeks at most, have at least 20 working implementations.
When the standard is public (either public domain or public license),
these classroom assignments become "patent proof".
Keep in mind that public standards are adopted for the purposes of
architecture.
The objectives of the IT architect, beyond the core functionality,
include:
Performance
Stability
Availability
Security
Supportability
Operational support
Information Lifecycle
Performance
Is the standard as efficient in terms of memory, CPU cycles, and I/O as
it can be? Complex supersets often become very unruly and can
adversely impact performance.
Stability
Is the system stable? Are there race conditions, deadlocks,
bottlenecks, or priority conflicts that will cause the system to
freeze, fail, or corrupt data being stored or retrieved?
Availability
Can the system run under full loads for extended periods of time?
Memory leaks, unbounded recursion, or other situations where even minor
garbage collection isn't effectively managed can often mean that a
system has to be rebooted, or applications restarted, simply because
the resources have been consumed by the rogue functions.
Security
Does the system make sure that only authorized users can access only
what they are authorized to access, and only in the manner authorized?
If a spammer can send you an e-mail that tricks Outlook into reading a
program and executing it, giving a complete stranger full access to the
entire hard drive, and possibly the entire network, you have a serious
problem.
Supportability
Can authorized administrators access and support the system using
low-bandwidth remote connections? If you MUST use ONLY a graphical
user interface, the bandwidth may be problematic. Often, when a system
is being hit with a denial of service attack, or has been caught in a
routing loop, there is very little bandwidth available and a low-speed
"teletype" connection may be the only way to find out what's actually
going on. Ideally it should be possible to do ANYTHING from a remote
site. If the building containing the machine were evacuated, could
remote administrators get the critical information and make logistical
changes?
Operational support
Are there monitoring and auditing provisions which allow support
personnel to monitor, detect, and audit the cause of system failures?
If a hacker does sneak in through the back door, can the system capture
enough information to provide probable cause for search warrants?
Would the system provide enough information to assure the criminal
conviction of the intruder?
Information Lifecycle
Is the information easy to create and edit? Is it easy to store? Is
it easy to track changes? Is it easy to locate in a large archive?
Are there limits on the type of search, version control, archive, and
back-up software? Will available storage be used efficiently? If I
have 1,000 employees creating 1 gigabyte of new content monthly, can I
quickly identify what needs to be archived, archive it efficiently, and
manage it securely? Could I properly respond to a court ordered
discovery request 3, 4, even 10 years after a file was stored in the
archive? Keep in mind that the biggest problem here is that a company
might need 1 terabyte per month per 1,000 employees to keep track of
all attachments in all e-mails, even though identical information may
be stored in 20-40 other places within the same 24-48 hour period.
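The attachment-duplication problem described above is what
content-addressed storage is meant to solve: store each unique blob
once, keyed by a hash of its contents, and keep reference counts for
the mailboxes that point at it. A minimal sketch (the class name and
numbers are illustrative, not any particular product's design):

```python
import hashlib

# Toy content-addressed store: identical attachments are stored once,
# no matter how many mailboxes reference them.
class DedupStore:
    def __init__(self):
        self.blobs = {}   # sha256 hex digest -> blob contents
        self.refs = {}    # digest -> reference count

    def put(self, data: bytes) -> str:
        digest = hashlib.sha256(data).hexdigest()
        if digest not in self.blobs:
            self.blobs[digest] = data          # first copy: store it
        self.refs[digest] = self.refs.get(digest, 0) + 1
        return digest                          # key the mailbox keeps

    def bytes_stored(self) -> int:
        return sum(len(b) for b in self.blobs.values())

store = DedupStore()
attachment = b"quarterly-report contents " * 1000
# The same attachment forwarded to 40 mailboxes...
for _ in range(40):
    key = store.put(attachment)
# ...occupies the space of exactly one copy.
print(store.refs[key], "references,", store.bytes_stored(), "bytes stored")
```

With this scheme the 20-40 redundant copies in the scenario above cost
only a hash table entry each, which is why a well-chosen archive format
matters for the terabyte-per-month math.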
My response to the posting follows.
> ,----[ Quote ]
> | ECMA has strong ties with Microsoft,
Can anybody confirm this? Seems to me like Microsoft has a 25-year
history of absolute contempt for standards and standards bodies. When
Microsoft does participate, it's primarily to try to force-feed back
doors for Microsoft "enhancements" that end up turning into massive
security holes for worms, trojans, viruses, spyware and malware.
Remember when Microsoft decided to throw its weight into the W3C? We
ended up with ActiveX being implemented and doing $60 billion/year or
more in damages.
Remember when Microsoft offered to help define the DVD standard?
Suddenly providing DVD-CSS drivers for Linux was a criminal act. Even
linking to the site in Norway was considered a criminal act. Suddenly
Microsoft and the MPAA were trying to drag a 15-year-old kid who had
spent his whole life in Texas into a California court, where they could
find a friendly federal judge and the kid would have no access to
family, friends, or a jury.
> | and the very philosophy of the
> | ECMA is to acknowledge existing technologies and call them a
> | standard.
Usually there are some conditions for approval though. The provider
has to supply a complete specification, with no subsets or supersets,
that is fully implementable using only the available specification.
Approve a standard that doesn't meet those criteria, and the standard
doesn't get adopted. If a standards body approves too many standards
that don't meet the criteria, the standards body itself loses
credibility. This may have been Microsoft's true goal all along.
Let's face it: if a standards body has established standards that all
of Microsoft's competitors have accepted, and those standards threaten
Microsoft's ability to maintain its monopoly, Microsoft can strong-arm
through a non-standard "standard" that does not meet the criteria, and
when the body publishes it as a standard, the body loses its
credibility as a standard-bearer. Microsoft has tried to do this to
ISO, W3C, OMG, IEEE, MPAA, MPEG, and even the IETF.
We ended up with DCOM being confused with CORBA, ActiveDirectory being
confused with LDAP, DHCP being confused with RARP, ActiveX being
confused with HTML, XML being confused with SGML, IDE being confused
with SCSI, and so on. In each case, by the time the standards bodies
and the members who had previously supported their established
standards realized they had been sold a "pig in a poke, with lots of
lipstick", Microsoft's proprietary monopoly-building, monopoly-protecting
standard had become the ONLY standard Microsoft supported, and there
was no remaining opposition from the competing standards.
What would have happened if SCSI had been adopted the FIRST time that
IDE turned out to be too small (1992)? We might have seen 2-3 gigabyte
drives much earlier.
What would have happened if CORBA had been adopted instead of
DCOM/COM+/.NET back in 1997? We might have seen huge distributed
networks of real-time interactive applications based on CORBA standards
such as those used by ORBit.
What would have happened if LDAP had been adopted instead of the detour
into ActiveDirectory? We might have seen single-sign-on security
across Windows, Linux, Java, and Apple - even mainframes, with
Mainframe level security.
What if we had adopted SGML instead of Office-based ActiveX controls and
Office plug-ins? Perhaps we would have been using LinuxDoc based
technology to create integrated multimedia content directly compatible
with the major publishers such as McGraw-Hill, Gannett, and
Scripps-Howard.
And what will we lose if we adopt OpenXML instead of Open Document
Format? The ability to create documents, drawings, spreadsheets, and
databases which are fully implemented in the standard, instead of
simply having envelopes for Microsoft's proprietary Office document
objects such as OLE and ActiveX objects.
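The "fully implemented in the standard" property is easy to
demonstrate for ODF: an .odt file is an ordinary ZIP archive whose
document body is plain XML in content.xml, readable with nothing but a
standard library and the published OASIS namespaces. A sketch (the
file path is a placeholder; only the text:p paragraph element is
handled here):

```python
import zipfile
import xml.etree.ElementTree as ET

# Namespace for text elements, as published in the OASIS ODF spec.
ODF_TEXT_NS = "urn:oasis:names:tc:opendocument:xmlns:text:1.0"

def extract_paragraphs(path):
    """Return the plain text of every paragraph in an ODF text document."""
    with zipfile.ZipFile(path) as odt:           # e.g. "report.odt" (placeholder)
        root = ET.fromstring(odt.read("content.xml"))
    # Walk every <text:p> element and flatten its text content.
    return ["".join(p.itertext())
            for p in root.iter(f"{{{ODF_TEXT_NS}}}p")]
```

No opaque embedded object model is needed to get at the content, which
is exactly the difference between a specified format and an envelope
around proprietary objects.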
What if we had adopted RARP instead of DHCP? We would have had audit
trails back to most of the malware, spyware, viruses, trojans, and
spoofs, along with the ability to block most of the more common
attacks.
Nearly all of Microsoft's anticompetitive "standards" have left huge
security holes. Was this by design? Was Microsoft trying to leave
itself a back-door to audit piracy? Or was it just that Bill Gates
didn't bother to talk to all of the hundreds of contributors who had
defined the established standard that Microsoft was trying to disable?
Microsoft's "anti-standards" have almost always had negative results.
Safeguards built into the established standards are often disabled.
Perhaps if these standards bodies were held liable for endorsing
standards which created the access to $60 billion in damages per year,
they would be less eager to take the token hand-outs offered by
Microsoft.
The problem is that Microsoft thinks in terms of its $40-60
billion/year revenue stream, but doesn't think in terms of the $600
billion in primary, secondary, and collateral damage that occurs when
its ill-conceived anti-standards are endorsed and adopted.
> | This view is quite opposed to the one of the OASIS
> | consortium, because the consortium does try to design
> | specifications based on consensus, plausible engineering
> | decisions and not on the fait accompli.
Refer back to my original description of the intent, purpose, and
nature of a true standard.
ECMA can say "this is a standard" but that does not make it so. In
this particular case, major vendors, corporate customers, and support
vendors may just decide "we don't want to have to deal with this".
> | [...]
> | Microsoft's Open XML file format may perhaps be one day an OSI
> | standard (who knows, so many things are OSI certified, one day
> | Microsoft may be able to certify its own business practices as
> | the ISO standard for monopolistic position and anti-competitive
> | behaviour) but it will never be an open standard.
OSI totally abused the term 'open'. When OSI was first conceived, the
proposals were based on BSD- and MIT-licensed software and student
implementations of CCITT specifications. As the OSI standards process
expanded, IBM wanted SNA and APPC added as a superset, but didn't want
to provide documentation. DEC wanted DECnet added as a superset, but
didn't want to provide documentation. Novell wanted NetWare added as a
superset, but didn't want to provide documentation. But even simple
tasks like the equivalent of an IP address or DNS were never accepted
and agreed upon.
> | [...]
> |
> | By writing 6000 pages, something else strikes many, including myself:
> | no human can implement that. In fact, nobody aside Microsoft will be
> | able to rightly implement it because Microsoft is the only one can
> | deal with the previously existing formats.
Keep in mind that most of these previously existing formats (things
like OLE, DCOM, COM+, .NET, and ActiveX objects) are not included in
the ECMA standard.
> | For these 6000 pages are
> | thousands of man/years of confusion, users' lock-in, con-formating
> | of data, IP and jealously kept trade secrets.
Remember one of the "acid tests" of a standard: could you break these
specifications into 60-70 page "packages", assign each "package" to a
classroom of 20-30 undergraduate students, and have at least 2/3 of
them produce a correctly functioning implementation that conforms to
the standard? If the answer is "no", then it's not a standard; it's
just a political endorsement of a well-funded proprietary format,
based on bribes, political infighting, and "vote packing".
OSI eventually became irrelevant as a standards body and the IETF
standards were adopted instead. The IETF struggled to retain its
credibility as a standards body, often rejecting proposed RFCs because
the specifications weren't specific enough.
> | And you would expect
> | that anybody might come up with something that works?
Again, not just anybody, but an undergraduate student with a knowledge
of C or Java.
Preferably a sophomore or junior. This assures that if a vendor does
implement the protocol, the implementation cannot be monopolized with a
patent (since there are 20-30 unpatentable implementations).
> | Apple, by the way, will not. Because Microsoft Office for Mac will not be
> | able to use Open XML for some years, as I have learned.
That, and the fact that OpenOffice and StarOffice work on Macs running
OS X.
Gates messed with Jobs big time. He ripped off Jobs and then tried to
bankrupt Apple.
Neither Jobs nor Apple's executives have that short a memory. Both
Jobs and Gates used to say "keep your friends close, and your enemies
closer". Perhaps Apple will go public with an announcement to ignore
OpenXML entirely and support Open Document Format instead.
> | So good for the great open file format of Microsoft. 6000 pages cannot
> | be a standard.
OSI was nearly 50,000 pages. This included lots of pictures, lots of
duplicated fluff, lots of smoke and mirrors, and huge holes in the
implementation that prevented anyone from successfully implementing the
OSI standard.
> | It is FUD. It is a scandal, and a digital wart
> | in the industry. 6000 pages cannot be reputed conformant by
> | anybody else than their author. And their author is Microsoft.
> `----
Ultimately this may be the biggest problem of all. If the ONLY
contributor is Microsoft and its paid committee members, then it is
highly unlikely that anyone will actually adopt it as a standard.
Government agencies will ignore the standard and declare it a
non-standard. Vendors will simply state "we don't support that
standard", and with no third-party providers, it will go the way of OSI.
> http://www.libervis.com/blogs/5/charles/a_cathedral_of_formats_or_a_castle_of_cards
There might be a handful of good ideas in the OpenXML standard, and
perhaps some of these will be integrated into ODF or other public
standards and OSS implementations. Microsoft would normally be
required to surrender all intellectual property rights claims over the
actual specification itself. Failure to do so would be the "kiss of
death" in terms of non-support. It would be a "nonstandard".