
Re: U.S. DoD catching on?

begin  oe_protect.scr 
Roy Schestowitz <newsgroups@xxxxxxxxxxxxxxx> espoused:
> __/ [ Mark Kent ] on Friday 21 July 2006 14:03 \__
> 
>> begin  oe_protect.scr
>> Roy Schestowitz <newsgroups@xxxxxxxxxxxxxxx> espoused:
>>> __/ [ Mark Kent ] on Thursday 20 July 2006 01:35 \__
>>> 
>>>> begin  oe_protect.scr
>>>> B Gruff <bbgruff@xxxxxxxxxxx> espoused:
>>>>> Appears so:-
>>>>> 
>>>>> http://www.acq.osd.mil/actd/articles/OTDRoadmapFinal.pdf
>>>>> 
>>>>> Clearly, the U.S. DoD is anti-capitalist, anti-American, full of
>>>>> commies.... but it does seem to be in favour of OSS (and that's not the
>>>>> Office of Strategic Service!)
>>>> 
>>>> I wonder if the days of government excess are drawing to some kind of a
>>>> close?  There was a time when governments could throw shedloads of cash
>>>> at projects and nobody would turn a hair, whereas now, there does seem
>>>> to be a lot more interest in the details of what's going on.  That said,
>>>> the US budget deficit is the stuff of legend.
>>> 
>>> Throwing money at Microsoft is like employing your own son,
>>> though. It makes such news more encouraging than, let us
>>> say, the French moving to Open Source. On the issue of
>>> budgets, it is rather sad that funds for research are
>>> declining in the States. Education is affected in line with
>>> this trend, which leads to a dangerous cycle wherein
>>> youngsters become less literate and less able to compete
>>> in the tech world. Then you have the debates about A-levels
>>> and GCSEs, which were argued to be getting easier just
>>> because students are doing better...
>> 
>> That's not quite the issue.  Where it went wrong with GCSE and A-level
>> is that the traditional system imposed a distribution on results, such
>> that a fixed proportion of people would get an A, B, C, D, E or F grade each
>> year, so the system was naturally consistent.  You knew that if you
>> interviewed people with three As, then they'd come at the top of their
>> group/year/cohort.
>> 
>> The new system sets an arbitrary line and attempts to determine who
>> has gone beyond it and who has not.  It's a bit like giving every
>> 100 metre sprinter who beats 10 seconds a gold medal, rather than
>> giving gold to whoever comes first, silver to whoever comes second,
>> and bronze to third.  It got so bad that in the end, the government
>> had to invent a new grade, the "nursery" A* grade.
> 
> 
> *Aha*! I carry on learning new things in COLA. So I guess the grading
> suffers from the same issue that psychometric exams such as the SATs
> resolve by scaling the results to ensure a consistent distribution of
> scores.

That's exactly right.  Back when I did O-level and A-level, the results
were normalised and grades were then awarded to fixed percentages of
candidates.  It worked very well indeed, as the numbers of people taking
O- and A-level were high enough to apply simple statistics to the
problem.  It also left schools the opportunity to challenge grades where
individuals didn't do as well as expected.

It meant that you were competing against everyone else in the cohort,
whereas now, you're just trying to do a 10 second 100 metre race.
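
To make the contrast concrete, here's a rough sketch in Python.  The
marks, grade shares and cut-off percentages below are all invented for
illustration -- nothing any exam board actually uses -- but the first
function ranks the cohort and hands each grade to a fixed share of
candidates, as the old system did, while the second hands a grade to
anyone who clears a fixed mark, the sub-10-second medal again.

# Rough sketch only: made-up marks, grade shares and cut-offs.

def norm_referenced(marks, shares=(("A", 0.10), ("B", 0.15), ("C", 0.25),
                                   ("D", 0.25), ("E", 0.15), ("F", 0.10))):
    """Rank the whole cohort, then award each grade to a fixed share of it."""
    ranked = sorted(marks, reverse=True)
    result = {}
    i = 0
    for grade, share in shares:
        count = round(share * len(ranked))
        for m in ranked[i:i + count]:
            result.setdefault(m, grade)   # tied marks keep the higher grade
        i += count
    for m in ranked[i:]:                  # any rounding remainder gets the bottom grade
        result.setdefault(m, shares[-1][0])
    return result

def criterion_referenced(marks, cutoffs=((80, "A"), (70, "B"), (60, "C"),
                                         (50, "D"), (40, "E"))):
    """Award a grade to anyone who clears a fixed mark, however many that is."""
    def grade(m):
        for cutoff, g in cutoffs:
            if m >= cutoff:
                return g
        return "F"
    return {m: grade(m) for m in marks}

cohort = [91, 88, 84, 83, 79, 76, 72, 68, 65, 61, 55, 48]
print(norm_referenced(cohort))        # the top 10% get an A, whatever the marks were
print(criterion_referenced(cohort))   # every mark of 80 or more gets an A

Run it and the first scheme hands out exactly one A for that cohort, as
it would every year; the second hands out four, and a stronger (or more
generously marked) cohort would collect more still -- which is precisely
why "more As than ever" tells you nothing about the candidates.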

> 
> I didn't take part in the school-level educational system in the UK, as
> you can probably tell...

Hehe..

> 
> 
>> From an employment perspective, I'm interested in how candidates have
>> performed against their peers, because I'd like to pick the best of this
>> year's crop, as it were.  The present system is such that most of the
>> people I now interview are not remotely capable of doing what I'm
>> looking for, at least those from the UK, and we employ more and more
>> people from abroad.
>> 
>> This is a crime: it lets down whole generations of students who have
>> no way of proving their level of competence against their peers, and
>> employers do not have the time or the resources to do it for them.
> 
> 
> Interesting. I can think of scenarios where tests truly had these
> issues. Returning to aptitude and psychometric tests as an example,
> practice improves people's scores. So do they truly gauge intelligence?
> Or skills? Or is it too easy to /adapt/ -- through practice -- to the
> required skill set? Many say that education measures people's
> willingness to learn rather than innate potential...

My sister's an industrial psychologist at Guelph University - she can
wax lyrical about this for hours on end...

> 
> 
>> The perpetrators of this crime are the governments who wish to claim
>> they're improving education, because they can genuinely show that more
>> students are getting As than before.  If you've no comprehension of
>> stats, then that's probably convincing.  For the rest of us, it's just
>> manipulation of people's lives.


-- 
| Mark Kent   --   mark at ellandroad dot demon dot co dot uk  |
UNIX was not designed to stop you from doing stupid things, because that
would also stop you from doing clever things.
		-- Doug Gwyn
