
Re: [News] PlayStation 3 Could Win Owing to Blu-Ray, XBox360 Alienates Foreigners

BearItAll <spam@xxxxxxxxxxxxx> espoused:
> Mark Kent wrote:
> 
>> Roy Schestowitz <newsgroups@xxxxxxxxxxxxxxx> espoused:
>>> __/ [ BearItAll ] on Tuesday 08 May 2007 08:29 \__
>>> 
>>>> Roy Schestowitz wrote:
>>>> 
>>>>> Pachter: PS3 Will Win Via Blu-Ray
>>>>> 
>>>>> ,----[ Quote ]
>>>>> | It takes a special kind of guy to want to write 207 page
>>>>> | reports for a living. A man of vision. A man of honor.
>>>>> `----
>>>>> 
>>>>> http://kotaku.com/gaming/the-pachter-factor/pachter-ps3-will-win-via-blu+ray-258309.php
>>>>> 
>>>> 
>>>> That wouldn't really be a reason for the Xbox360 to fall further into
>>>> the doldrums. The Xbox can link to users' computers, so they would
>>>> only need the one Blu-ray device. Alternatively, if I remember right
>>>> the Xbox360 has a USB connector; I suspect a lot of first-time buyers
>>>> of Blu-ray will go for an external device anyway.
>>>> 
>>>> Thinks: it just crossed my mind that a FireWire connector might be
>>>> needed for Blu-ray rather than USB; unless the device was given a huge
>>>> system buffer, USB may not be able to keep up.
>>> 
>>> Microsoft said it would support Blu-ray if this media type won the
>>> so-called 'race' or 'war'.
>>> 
>> 
>> A huge system buffer cannot make up for insufficient bandwidth, though.
>> At least, not unless you do the download /before/ you watch anything;
>> but then it is not streaming, it is a store-and-then-watch model, which
>> is not the same thing.
>> 
> 
> The CPU doesn't need to play a large part in the actual transfer, just
> DMA updates. Processing of audio isn't really an intensive task for the
> CPU; it just needs a buffer large enough that as it finishes each block
> the next block is already available. Similar for video, but the buffer
> is just a bit larger.


Blu-ray is both audio and video, isn't it?  In any case, the issue is
*not* about processor intensiveness, it's about the temporal relationship
of the data being maintained *at all times* within some extremely
stringent parameters.

The buffer size is the wrong end of the stick entirely - as soon as you
have a limited-bandwidth channel in place, particularly if it is in any
way shared, then you have a huge problem on your hands.

My point was that the buffer size is not important if the channel is not
large enough.  If the channel is shared, then it is impossible to create
a buffer large enough /unless/ you make it so large that you basically
collect the whole file and then play it, which is a file transfer, rather
than a stream.
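The arithmetic behind that point can be sketched in a few lines (all
figures are illustrative, not real Blu-ray numbers): if the channel's
sustained rate is below the playback rate, a bigger buffer only delays
the stall, it never prevents it.

```python
# Sketch: a playout buffer fed by a channel slower than the playback rate.
# Numbers are hypothetical, chosen only to illustrate the argument.

def time_to_underrun(buffer_bits, channel_bps, playback_bps):
    """Seconds until a pre-filled buffer empties; None if it never does."""
    drain = playback_bps - channel_bps   # net drain on the buffer per second
    if drain <= 0:
        return None                      # channel keeps up: no underrun
    return buffer_bits / drain

# 1 Gbit pre-buffered, a 30 Mbit/s stream over a 20 Mbit/s channel:
print(time_to_underrun(1e9, 20e6, 30e6))   # 100.0 seconds, then it stalls
# Double the buffer and you stall at 200 seconds instead - same outcome.
print(time_to_underrun(1e9, 40e6, 30e6))   # None: the channel is adequate
```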

> 
> For Blu-ray, where the actual data per frame is greater, plus whatever
> the decoding algorithm adds, a USB link with a reasonably large buffer
> might do the job. I don't know the actual details of Blu-ray, just the
> odd bits and pieces that are available on the Internet, so I don't
> really know how CPU intensive it is.
> 
> Streaming isn't really that different from this sort of buffering. 

Oh yes it is, Sir!  It's completely different.


> The
> primary difference is that processing cannot look behind, so what is
> passing through filtering or processing is not dependent on what is yet
> to come and has no more than a knock-on effect from what has already
> been processed. Of course the streaming process can make use of control
> codes for the streaming system itself, which in itself can change the
> processing of data that follows. But it is still a stream, like a fast
> river: it flows in one direction, it wears trenches in the river bed
> that affect the water following, and that water might be carrying rocks
> that change the flow of the water that follows. 

Sorry - entirely the wrong analogy, pretty, but not appropriate.  

The critical piece, which you have not recognised (most people don't),
is that the human brain cannot cope with jitter, wander, lost packets,
and so on; for an audio stream, the stream terminates in the head of
the listener.   Gaps are not allowed.

The second piece is that IP networks are lossy by design.  It doesn't
matter how much you try to play around with RTP or UDP, IP itself is
lossy, and lacks the means to guarantee delivery.  
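The fire-and-forget nature of UDP is easy to demonstrate (the port number
below is arbitrary, nothing is listening on it):

```python
import socket

# Sketch: UDP gives the sender no delivery guarantee.  sendto() reports
# success even though no one is listening on the destination port - it
# returns the bytes handed to the stack, not the bytes delivered.
s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sent = s.sendto(b"voice frame", ("127.0.0.1", 50007))
print(sent)   # 11 - and the datagram may simply vanish, with no error
s.close()
```

RTP adds sequence numbers and timestamps on top of this, so the receiver
can *detect* loss and reorder, but nothing below it can *prevent* loss.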

Thus, you cannot guarantee the bandwidth of the channel carrying the
stream; indeed, it can fall to zero on occasion (in fact, that is
precisely what does happen).  If your buffer plays out, and they often
do, then the stream stops, along with the enjoyment of your
listener/viewer.  As I said above, the buffer size is not the issue
here, except for a couple of things: buffering adds further latency,
which is in itself a problem, particularly for conversational work and
also for sync (a/v sync, say); and a large buffer cannot make up for an
unreliable channel.  It might do it *sometimes*, but it cannot do it
*all the time*.  
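The latency/resilience trade-off can be put in one line each (the 150ms
conversational budget is the commonly cited ITU-T planning figure; the
other numbers are illustrative):

```python
# Sketch: a jitter/playout buffer absorbs outages shorter than itself,
# but its entire depth is added to the end-to-end latency.

def survives_outage(buffer_ms, outage_ms):
    """A pre-filled buffer rides out any outage shorter than its depth."""
    return outage_ms < buffer_ms

def added_latency_ms(buffer_ms):
    """The buffer must fill before playout starts, so its depth IS latency."""
    return buffer_ms

print(survives_outage(200, 150))   # True: a short dropout is absorbed
print(survives_outage(200, 500))   # False: channel fell to zero too long
print(added_latency_ms(200))       # 200 - already over a ~150ms talk budget
```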

And finally, telecoms streams are *bi-directional* not unidirectional,
as most of them are conversations, so it gets more complex then, but
that's a problem for another day.

> 
> The first of the true streaming systems was on the 68xxx serial devices,
> they could filter on the fly in the communications device. But to maintain
> streaming rates it had to be on a word by word basis, no sub processing
> that might affect the flow. An example might be xoring with a comms key.
> 

... the first streaming systems were and remain the telecoms networks and
the radio/tv broadcasting networks.  The telco networks have been moving
voice signals, as streams, at 8000 frames/sec (125us frames) at 64kbit/s
(56k in the US) since the late 1970s/early 1980s.  They do it on the fly,
all the way around the world; they interleave the signals up to Gbit/s
rates, unravel them reliably at switching and flexibility points, and
drop them out to a customer.  The signalling system uses addressing which
puts IP to shame: any number of devices can be addressed directly by
the customer, even mobile devices.  There is no need for NAT/PAT in the
C7 environment, just as there is very little buffering, almost no echo,
minimal latency and virtually zero packet or data loss.  What's more,
each call is tracked, and billed.  End-to-end latency is almost always
within 250ms, and almost never > 400ms.  Echo control is always used
when latency > 25ms.
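The TDM arithmetic above checks out in a few lines (the 32-timeslot E1
trunk is an illustrative European example; a US T1 carries 24 slots):

```python
# Sketch of the TDM figures quoted above: one voice channel is sampled
# 8000 times a second (a frame every 125 microseconds), 8 bits per sample.

FRAMES_PER_SEC = 8000
BITS_PER_SAMPLE = 8

channel_bps = FRAMES_PER_SEC * BITS_PER_SAMPLE
print(channel_bps)            # 64000 -> the 64kbit/s figure

frame_period_us = 1e6 / FRAMES_PER_SEC
print(frame_period_us)        # 125.0 -> the 125us frame

# Interleaving: an E1 trunk carries 32 such timeslots back-to-back.
print(32 * channel_bps)       # 2048000 -> 2.048 Mbit/s per E1
```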

And finally, all that goes on using about 20% of the bandwidth which an
equivalent VoIP call on the internet requires.  Now there's progress :-)

Anyway, I suggest you have a read of the PBB-TE article on Wiki for more
information - it's very good.

-- 
| Mark Kent   --   mark at ellandroad dot demon dot co dot uk          |
| Cola faq:  http://www.faqs.org/faqs/linux/advocacy/faq-and-primer/   |
| Cola trolls:  http://colatrolls.blogspot.com/                        |
