(Radio Passioni) – These days the BBC Internet Blog is discussing visual radio, following the launch on 12 January of a new player that offers an unprecedented real-time window onto the studios, the places where radio is made. The point is not to turn radio into television, but to find new cross-media keys of interpretation for a medium that in some respects is no longer sufficient unto itself and is looking for a way to flow onto the Internet platform. Without losing its nature, of course, but also without disappearing.
Click here to view Tristan Ferne's post explaining the technical details; there you will find all the other links. The trial is no longer running, but the Radio Labs team's observations are very interesting.
How visual radio works
Tristan Ferne 16 Jan 09
Yasser introduced visual radio on Monday and the trial has been running all this week. There is one last chance to catch it: on Annie and Nick this Sunday night (7pm – 10pm). The technical team behind it have written this post to give some idea of how the technology driving the system works. So over to the tech team of Conor Curran, Sean O'Halpin, Chris Bowley, Duncan Robertson, Nick Humfrey, Craig Webster, Ant Smith, Terry O'Leary and Will Kinder.
One of the key elements of this project was giving the editorial team the ability to control the user client in real time from the studio.
We considered HTTP polling, but in the end found that the only way we could achieve the low latency and scale we required was to push messages to the client. We tried and rejected XMPP, mainly because of its verbosity and the lack of decent pub/sub support in any Ruby XMPP library.
The first solution that showed promise was Juggernaut. This works by embedding a small Flash client in the HTML page for the sole purpose of providing an XML socket connection back to the server, which it bridges to JavaScript. The server side is written in Ruby and integrates very well with Rails. Unfortunately, our tests showed that Juggernaut cannot yet scale to the levels we required. However, it provided the inspiration for our eventual solution.
As it turns out, we were already using a very similar solution to display LiveText on our network home pages, via a third-party realtime messaging service called En Masse, provided by Monterosa. Putting together what we'd learned from Juggernaut with the proven scalability of En Masse, we were able to piggyback our protocol on the existing messaging channel.
When a user loads the Visual Radio console, the client opens an XMLSocket connection to the En Masse server, and throughout the lifetime of that connection the server pushes XML messages to the Flash client.
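To make the push mechanics concrete: Flash's XMLSocket protocol is simply XML messages sent over a plain TCP connection, each terminated by a null byte. A minimal Ruby sketch of a push server in that style might look like the following; the port and message format here are illustrative, not the En Masse protocol.

```ruby
require 'socket'

# Minimal push server speaking Flash XMLSocket framing: each message
# is an XML document terminated by a null byte. Port and payload are
# illustrative only, not the real En Masse protocol.
server = TCPServer.new(8080)
clients = []
clients_lock = Mutex.new

# Accept connections and hold them open for the lifetime of each client
Thread.new do
  loop do
    socket = server.accept
    clients_lock.synchronize { clients << socket }
  end
end

# Periodically push a message to every connected client
loop do
  message = "<message><text>hello at #{Time.now}</text></message>\0"
  clients_lock.synchronize do
    clients.reject! do |socket|
      begin
        socket.write(message)
        false
      rescue IOError, Errno::EPIPE
        true # drop clients that have disconnected
      end
    end
  end
  sleep 5
end
```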
Messages are fed to the En Masse server from our back-end ActiveMQ messaging system. All the back-end processes are written in Ruby using our own smqueue library.
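smqueue's exact API isn't shown here, but since ActiveMQ speaks the STOMP protocol, a rough sketch of a producer and a consumer using the widely available stomp gem gives the flavour. Host, credentials and queue name are placeholders, not the real Visual Radio configuration.

```ruby
require 'rubygems'
require 'stomp'

# Connect to ActiveMQ over STOMP; all connection details are placeholders
client = Stomp::Client.new('guest', 'guest', 'localhost', 61613)

# Producer: put a message on a queue
client.publish('/queue/visualradio.events', '<event type="module">quiz</event>')

# Consumer: process messages as they arrive
client.subscribe('/queue/visualradio.events') do |msg|
  puts "received: #{msg.body}"
end

client.join # block while the subscription runs
```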
To control all this and give the studio the realtime control they needed, we built a Ruby on Rails web application which sends messages to the client via the messaging infrastructure. If the Radio team want to push content to all the people connected to the Visual Radio console, they activate the specific module they want to show within the Rails admin application, and a chain of events occurs. At a high level, what happens is (a code sketch of the chain follows the list):
1. The admin application puts a message containing a URL to a resource on a queue
2. A process watching this queue gets the message and parses it
3. A request is made back to the resource URL above, which returns an XML packet
4. This XML is then posted to a server
5. This server then messages all connected clients via their XMLSocket connections
6. The client parses the XML and displays the information to the user
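Condensed into Ruby, that chain might look like the sketch below. The queue name and the push server endpoint are placeholder assumptions; the real system used smqueue, ActiveMQ and En Masse.

```ruby
require 'rubygems'
require 'stomp'
require 'net/http'
require 'uri'

# Watch the queue the admin application publishes to (names are illustrative)
client = Stomp::Client.new('guest', 'guest', 'localhost', 61613)

client.subscribe('/queue/visualradio.modules') do |msg|
  # Steps 1-2: get the message and parse out the resource URL
  resource_url = msg.body.strip

  # Step 3: request the resource, which returns an XML packet
  xml = Net::HTTP.get(URI.parse(resource_url))

  # Step 4: post the XML to the push server (a stand-in for En Masse),
  # which then fans it out to all connected XMLSocket clients (steps 5-6)
  push_server = URI.parse('http://localhost:9000/broadcast')
  Net::HTTP.post_form(push_server, 'xml' => xml)
end

client.join
```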
Now Playing Information
Whenever a track gets played on Radio 1, the playout system sends us a file containing the details of what was played, via an HTTP POST request. This message is then put on a message queue so that it can be archived and sent to other systems.
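A minimal sketch of such a receiving endpoint, written with Sinatra and the stomp gem; the route, port and queue name are assumptions, not the real interface.

```ruby
require 'rubygems'
require 'sinatra'
require 'stomp'

# Queue connection; credentials and queue name are illustrative
queue = Stomp::Client.new('guest', 'guest', 'localhost', 61613)

# The playout system POSTs a file describing the track just played.
# The route is an assumption for this sketch.
post '/now-playing' do
  track_data = request.body.read
  # Put the raw message on a queue so it can be archived
  # and consumed by other systems
  queue.publish('/queue/nowplaying.raw', track_data)
  'OK'
end
```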
The track data is sent to us in a proprietary text format created by the playout system vendor, so the first stage is to parse it into a data structure that is easier to process. The result is then put onto another message queue, again so that other systems can make use of it.
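The real playout format is proprietary and not public. Purely for illustration, the sketch below assumes a hypothetical key=value line format and republishes the parsed result as JSON.

```ruby
require 'rubygems'
require 'stomp'
require 'json'

# The actual playout format is proprietary; this parser assumes a
# made-up "key=value" line format purely for illustration.
def parse_track(raw)
  raw.split("\n").inject({}) do |track, line|
    key, value = line.split('=', 2)
    track[key.strip] = value.to_s.strip
    track
  end
end

client = Stomp::Client.new('guest', 'guest', 'localhost', 61613)

# Consume raw messages, parse them, and republish the structured
# version onto another queue for downstream systems
client.subscribe('/queue/nowplaying.raw') do |msg|
  track = parse_track(msg.body) # e.g. {"artist" => "...", "title" => "..."}
  client.publish('/queue/nowplaying.parsed', track.to_json)
end

client.join
```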
The next message queue processor looks up the artist in MusicBrainz, using the artist name and the track title for disambiguation. If it unambiguously finds a matching artist in MusicBrainz, we add this information to the message, which can then be used to fetch an image and biographical information from http://www.bbc.co.uk/music/artists/:artistid.
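As a sketch of that lookup, here is a query against the MusicBrainz web service. This uses today's ws/2 JSON API, which postdates the original system, and the error handling is minimal; treat it as an assumption-laden illustration rather than the production code.

```ruby
require 'rubygems'
require 'net/http'
require 'uri'
require 'json'

# Search MusicBrainz for a recording, using the track title to
# disambiguate between artists with the same name, and return the
# artist's MusicBrainz ID. Uses the current ws/2 JSON API.
def musicbrainz_artist_id(artist_name, track_title)
  query = URI.encode_www_form_component(
    %(artist:"#{artist_name}" AND recording:"#{track_title}")
  )
  url = URI.parse("https://musicbrainz.org/ws/2/recording?query=#{query}&fmt=json&limit=1")
  response = JSON.parse(Net::HTTP.get(url))
  recording = response['recordings'].to_a.first
  return nil unless recording # no unambiguous match found
  recording['artist-credit'].first['artist']['id']
end

artist_id = musicbrainz_artist_id('Coldplay', 'Viva la Vida')
if artist_id
  # The MBID keys the artist's image and biography page at
  # http://www.bbc.co.uk/music/artists/:artistid
  puts "http://www.bbc.co.uk/music/artists/#{artist_id}"
end
```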
Architecture
This may be just a trial, but because it's being broadcast live we need it to be reasonably fault-tolerant. To that end we're using a high-availability Apache-based web tier which proxies requests back to multiple application servers. Each application server connects to a high-availability MySQL-based database tier. If one of our servers fails, another will automatically take over with minimal disruption to service.
To manage the high-availability web and database tiers we're using open-source software called Heartbeat. Our application servers are all Mongrels running Rails. Underpinning everything, we run Linux under Xen, which provides efficient virtualisation of our physical hardware.
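For a flavour of what that proxying tier looks like, here is an illustrative Apache mod_proxy_balancer snippet spreading requests across a few Mongrel instances; the ports and balancer name are assumptions, not the production configuration.

```
# Illustrative only: balance requests across several local Mongrels
<Proxy balancer://mongrels>
  BalancerMember http://127.0.0.1:8000
  BalancerMember http://127.0.0.1:8001
  BalancerMember http://127.0.0.1:8002
</Proxy>
ProxyPass / balancer://mongrels/
ProxyPassReverse / balancer://mongrels/
```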
Streaming Audio and Video
The vision mix output, along with the audio feed from the studio, was encoded using a Flash media encoder. The On2 VP6 codec was used for the video and MP3 for the audio. This feed was streamed over an SDSL connection to the BBC's Content Delivery Network. Once launched, the Flash client attempts to connect to this stream, hosted by the third party.
You can read more about developing visual radio and how the Flash console works over at some of the developers’ blogs.