  <?xml version="1.0" encoding="utf-8"?>
<?xml-stylesheet title="XSL_formatting" type="text/xsl" href="/blogs/shared/nolsol.xsl"?>

<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/">
<channel>

<title>BBC Internet Blog - Brandon Butterworth</title>
<link>http://www.bbc.co.uk/blogs/bbcinternet/</link>
<description>Staff from the BBC&apos;s online and technology teams talk about BBC Online, BBC iPlayer, and the BBC&apos;s digital and mobile services. The blog is reactively moderated. Posts are normally closed for comment after three months. Your host is Eliza Kessler.</description>
<language>en</language>
<copyright>Copyright 2012</copyright>
<lastBuildDate>Mon, 27 Oct 2008 13:00:00 +0000</lastBuildDate>
<generator>http://www.sixapart.com/movabletype/?v=4.33-en</generator>
<docs>http://blogs.law.harvard.edu/tech/rss</docs> 


<item>
	<title>History of the &apos;BBC Redux&apos; project</title>
	<description><![CDATA[<p>At  in August 2008, we showed developers an internal research project called "BBC Redux". At the time, it was mentioned in  and on , and you may have wondered what it is.</p>

<p>In the summer of 2007,  (me too),   and others were calling for the BBC to , and the  had .</p>

<p>Slated for a Christmas launch, the system only supported Windows Media with  and  delivery. There were some concerns from ISPs about how this might affect them.</p>

<p><a name="1top"></a>Cross-platform support has always been a source of grief <small><sup>[<a href="#1bottom">1</a>]</sup></small> leading us to duplicate systems and to reduce functionality in order to support multiple formats equally.</p>

<p>The problem wasn't going to go away and with the expansion of video and audio into devices such as games consoles, it was getting worse.</p>

<p><a name="2top"></a>We needed a different approach, to divorce our content production from delivery format and method - something quickly adaptable to new devices and the unmentionable "C word": . <small><sup>[<a href="#2bottom">2</a>]</sup></small></p>

<p>I'd had an idea for a system to do this festering for a few years - the mother of all  systems. I'd proposed this at the start of the iPlayer project and now it seemed time to just get on with it.</p>

<p>Armed with a couple of trusty developers, Tom and Dickon, a demonstrator was quickly built. We called this project "BBC Redux".</p>

<p>It's a video-on-demand test bed where we can try out the systems that acquire, store, search and deliver content. It's like your PVR but much larger and can convert and stream content to suit playback on more than just TVs.</p>

<p>So how does it  work?</p>

<p>To start with we ingest , though sourced from , as direct programme feeds take time to organise. There's a server that records each stream and segments it into programmes just as your PVR does.</p>

<p>Each content recorder (there may be more than one of each subsystem for redundancy) offers its files to the content manager(s) that look after storage distribution and management. The content manager(s) determine which content store to save them on. Then there are front-end servers to test applications for delivering to users and a stack of transcoders that are the glue between the neutral high quality storage format and what the users want.</p>
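
<p>As a rough sketch of that flow (the class and method names below are hypothetical, not the real Redux code), a recorder offering a programme to a content manager, which picks a store and later hands work to the transcoders, might look something like this:</p>

<pre><code># Hypothetical sketch of the flow described above: a recorder offers each
# programme to a content manager, which picks a content store; transcoders
# later turn the stored master into whatever format a device needs.
# None of these names come from the real Redux code.
from dataclasses import dataclass

@dataclass
class Programme:
    title: str
    channel: str
    size_gb: float                      # size of the high-quality master file

class ContentStore:
    def __init__(self, name, capacity_gb):
        self.name, self.free_gb, self.files = name, capacity_gb, []

    def save(self, prog):
        self.free_gb -= prog.size_gb
        self.files.append(prog)

class ContentManager:
    """Looks after storage distribution: here, pick the store with most free space."""
    def __init__(self, stores):
        self.stores = stores

    def accept(self, prog):
        store = max(self.stores, key=lambda s: s.free_gb)   # simplest placement policy
        store.save(prog)
        return store

def transcode(prog, profile):
    """Stand-in for the transcoder pool: neutral master in, device format out."""
    return f"{prog.title} encoded with profile '{profile}'"

# A recorder segments the broadcast stream into programmes and offers them up.
manager = ContentManager([ContentStore("store-a", 200_000), ContentStore("store-b", 150_000)])
master = Programme("Example Programme", "BBC One", 8.5)
print(manager.accept(master).name)
print(transcode(master, "h264-ipod"))
</code></pre>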

<p>In the summer of 2007, we showed our prototype to  and the FM&T board.</p>

<p>The point was made (streaming video could work) and iPlayer was set to have Flash streaming for the Christmas launch (after some quick work by the iPlayer team). A bonus for the ISPs is that Flash streams are approximately a third the size of the downloads and people only transfer them once. Currently, around 90% of BBC iPlayer use is Flash streaming.</p>
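
<p>As a back-of-the-envelope illustration of that saving (the bitrates here are assumed for the example, not the actual iPlayer encoding profiles):</p>

<pre><code># Back-of-the-envelope check on "roughly a third the size": the bitrates here
# are illustrative assumptions only, not the actual iPlayer encoding profiles.
duration_s = 30 * 60                       # a 30-minute programme
stream_kbps, download_kbps = 500, 1500     # assumed Flash stream vs download rates

stream_mb = stream_kbps * duration_s / 8 / 1000
download_mb = download_kbps * duration_s / 8 / 1000
print(f"stream ~{stream_mb:.0f} MB, download ~{download_mb:.0f} MB, "
      f"ratio {stream_mb / download_mb:.2f}")   # about a third, and only sent once
</code></pre>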

<p>We also demonstrated to the board a  that plays our multicast live streams and iPlayer VoD - part of the longer term aim to make internet TV more usable in the home.</p>

<p>We continued to experiment with BBC Redux, making it work  for various games consoles - PS3; PSP;  - and when the iPhone came out in November 2007, we added iTouch/iPhone and later 3G phones.</p>

<p><img alt="Brandon Butterworth - image by Chris Capstick" src="http://www.bbc.co.uk/blogs/bbcinternet/img/brandon_butterworth_ipod.png" height="348" width="430"></p>

<p>Some of this work was later used for iPlayer beta sites ( is 3% of use).</p>

<p>, we let the public see and use one of our development front ends for the first time, with subtitle support freshly hacked that week - so new it was still being worked on through the Saturday.</p>

<p>Redux was built to support the rapid development of new services and was designed to scale in many directions.</p>

<p>The transparent on-demand  is how we can quickly adapt to new devices. When iPlayer chose to , they wanted a slightly different encoder profile than the one we'd used for demos. No problem: we defined the coding profile, allocated a subset of the transcoder pool to them and let them trawl through the store collecting all the programmes they needed for the seven day window. A few iterations were needed as the launch deadline approached, so more transcoders were added to their pool as time ran out. The risk was too high to try all this in the iPlayer live production systems; they could be adapted later.</p>
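
<p>In sketch form - with invented names, not the real Redux scheduler - reserving a slice of the transcoder pool for one coding profile and growing it as the deadline approaches looks roughly like this:</p>

<pre><code># Hypothetical sketch of reserving part of the transcoder pool for one coding
# profile and letting it trawl the store - not the real Redux scheduler.
from itertools import cycle

store = [f"programme-{n}" for n in range(1, 9)]     # masters in the seven-day window
pool = ["tc1", "tc2", "tc3", "tc4", "tc5", "tc6"]   # the whole transcoder pool

def allocate(pool, count):
    """Reserve a subset of the pool for one client's profile."""
    return pool[:count]

def queue_jobs(transcoders, files, profile):
    """Spread the backlog across the reserved transcoders, round-robin."""
    return [(machine, f, profile) for machine, f in zip(cycle(transcoders), files)]

iplayer_pool = allocate(pool, 2)                    # start with a small subset...
jobs = queue_jobs(iplayer_pool, store, "iplayer-flash-v1")
iplayer_pool = allocate(pool, 4)                    # ...add transcoders as the deadline nears
jobs = queue_jobs(iplayer_pool, store, "iplayer-flash-v2")
print(len(jobs), "jobs queued on", iplayer_pool)
</code></pre>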

<p>Redux has been quite versatile - over the past year it has grown in use as a demonstrator of a tapeless world - a way of trying out ideas like:<ul><li>instant access programme compliance store</li><li>immediately accessible archive - no delays waiting for a DVD/VHS in the post, so programme teams are using it for research.</li><li>a usable infinite archive - why would we ever throw programmes away again?</li><li>adaptable to produce new content formats on demand - when www.bbc.co.uk added  alongside , nobody had the content readily available to make WM versions of all the previous content; now it's not a problem</li><li>IPTV</li><li>closed user group access to content, either in production or, as is now being developed, for TV listing magazine reviewers who used to watch on postal DVD</li></ul></p>

<p>Redux consists of three racks of equipment - two are storage nodes - 342TB according to the disk manufacturers, 297TB of usable space with 152K programme files so far.</p>

<p align="center"><img alt="bbc_redux_kit.jpg" src="http://www.bbc.co.uk/blogs/bbcinternet/img/bbc_redux_kit.jpg" width="430" height="323" /><br><small><em>Full size images in  [ &#124; ]</em></small></p>

<p>There is also a baby one feeding iPlayer as, being a development system, we like to break the main one regularly.</p>

<p>Redux had its first birthday in July. We hope it will have many more before the experiment ends.</p>

<p align="center">&sect;</p>

<p><a name="1bottom"></a><small><sup>[1]</sup></small> Used since 1995 and selected as the most cross-platform system we could find, Real still met resistance from some Windows users: often corporate IT wouldn't let them install it or they and Linux users didn't like the
free player trying to entice them to upgrade to a paid-for version or use advertising.</p>

<p>BBC TV and Radio services work on any manufacturer's device; they compete on quality and features, not on exclusive access to content. It seemed silly to have to make special content for any particular internet device manufacturer, but that's how the market, governed by commercial rather than technical interests, has developed. [<small><a href="#1top">Return to post</a>]</small></p>

<p>For years, we've encouraged the adoption of standards-based systems, developing , a royalty-free  to reduce commercial barriers to adoption of a common standard. It's  in high-end broadcasting, but progress towards the internet has been slow. In the meantime, we've been supporting  and , which are taking off. The aim remains - people can choose their player; we don't need to care.</p>

<p><a name="2bottom"></a><small><sup>[2]</sup></small> Internet media are still much like that start of TV broadcasting where multiple systems were run in parallel before choosing one. [<small><a href="#2top">Return to post</a>]</small></p>

<p><em>Brandon Butterworth is Principal Technologist, Kingswood Warren, BBC FM&T. Image of Brandon by Chris Capstick. Image of protest by . Images of Redux kit by Brandon. UPDATE 24/9/09: Brandon Butterworth is now Chief Scientist, BBC.</em></p>]]></description>
         <dc:creator>Brandon Butterworth</dc:creator>
	<link>http://www.bbc.co.uk/blogs/bbcinternet/2008/10/history_of_the_bbc_redux_proje.html</link>
	<guid>http://www.bbc.co.uk/blogs/bbcinternet/2008/10/history_of_the_bbc_redux_proje.html</guid>
	<category>DRM</category>
	<pubDate>Mon, 27 Oct 2008 13:00:00 +0000</pubDate>
</item>

<item>
	<title>Brandon&apos;s History Of Online BBC</title>
	<description><![CDATA[<p><em>Brandon Butterworth is a Principal Technologist in the BBC's research and development team and the man who first registered the bbc.co.uk domain. He's such a key figure in the history of the BBC's technical infrastructure that he has a room named after him at bbc.co.uk towers. This post is part of the . [Update 24/9/09: Brandon is now Chief Scientist, BBC.]</em></p>
<p>Imagine there's no interweb...</p>
<p>...that's . We have a lot to thank the internet for - besides a new language and .</p>
<p><strong>The first 10 years were the best</strong></p>
<p>When I put the BBC on the net, it wasn't a  moment; we had to <a href="http://en.wikiquote.org/wiki/The_Hitchhiker's_Guide_to_the_Galaxy">keep banging the rocks together</a> for a while. We had email, file transfer, ,  and piracy since the late 1980s, ours via dial-up  through  to .</p>
<p>The USA had proper internet.</p>
<p>I wanted it.</p>]]><![CDATA[<p style="text-align: left;">First was the grand unification of internal networks, sharing a connection and the cost, and bringing together radio, TV and the World Service. I installed a circuit from the first fully commercial ISP in the UK - Pipex - and we were on the net. It didn't do much, but what had been terminally slow over batch dial-up became fast and, for the first time, we had direct connectivity to other hosts on the net.</p>
<p>Then the web arrived. We set up www.bbc.co.uk and started playing with mostly laughable content until we found a few like-minded people around the BBC who had more time and material for producing content. Little of the content is still online - except one of the first sites, which is available at .</p>
<p>From then on (summer 1994), programme-makers joined in or started their own projects, such as the . We grew on donated time and hardware, experimenting with technology and content.</p>
<p>Having drawn radio and TV content into the website, we also put internet into programmes. Email feedback seems trivial now, but being able to respond to a programme and have the presenter respond to you on air was far simpler than a phone-in.  questions into live political chat shows hooked News, and Radio 3's  programme was produced live from user-generated content and streamed.</p>
<p>We had our first foreign () language site too, though we later found that the producer had jumped the gun on a co-ordinated effort by World Service to do many languages, such was the enthusiasm.</p>
<p>Some projects, including the BBC Networking Club, found the expense of running their own independent services too much to continue and returned to the main site. This aggregation was to prove essential in growing the site and making it one of the most popular and cost-effective on the net. As BBC budgets were tight, we reduced the cost of adding a new site by having it share the same resources and technology as the rest: add a new feature for one, and they all could add it. This meant less waste, too: no resources sitting idle in one place while another needed them.</p>
<p>Keeping it together let us negotiate better prices and had a huge impact on use - there was one brand and everyone knew where to get BBC content. There had been lots of discussion over one domain or many. This was partly decided by squatters' use of some obvious addresses: if we used random domains it was likely that we'd eventually find that an obvious one was already a porn site. Guessing a site name within the site meant the site navigation and search could help people find the right one; guessing the domain got you an unhelpful message, someone else's site - or just an error.</p>
<p>That's not to say there weren't disasters.</p>
<p><img src="http://www.bbc.co.uk/blogs/bbcinternet/img/radio1_logo_1998.gif" alt="This is actually the 1998 logo for Radio 1 online" width="55" height="70" />Radio 1 inhaled deeply for its June 1996 site relaunch. The redesign was leading-edge, dynamically generated from Dynamo - a Java server engine for publishing dynamic content. Everything came from the database, which also stored user data for the really dynamic elements such as message boards. It also crashed when more than a few people used it at once, leading to a quick hack to make a regular site from the content. Sadly, lots of interesting features lost their edge but it now worked and the Shockwave elements were fine.</p>
<p><img src="http://www.bbc.co.uk/blogs/bbcinternet/img/john_peel_glastonbury.jpg" alt="john_peel_glastonbury.jpg" width="80" height="97" />This shifted the risk tolerance for many years. It was realised that a working site was more valuable than the shiniest possibility, and that what is impressive in development may not scale when faced with a 麻豆社 size audience.</p>
<p>It wasn't all bad, though: the site also led in other areas that did work. It featured an early version of the . We set up streaming of 30+ live programmes per week (around 80 hours) that played on demand until the next live transmission. There were also lots of live webcasts from events such as , made up of audio with an updating webcam, as video streaming hadn't been invented yet.</p>
<div class="imgCaptionCenter" style="text-align: center; display: block; ">
<img alt="The UK servers over time: one rack, to three, then two rows of nine" src="http://www.bbc.co.uk/blogs/bbcinternet/thdo.jpg" width="430" height="850" class="mt-image-center" style="margin: 0 auto 5px;" /><p style="width:430px;font-size: 11px; color: rgb(102, 102, 102);margin: 0 auto 20px;">The UK servers over time: one rack, to three, then two rows of nine </p></div>
<p>As the site grew, we hit the limit of our Pipex line and set up some servers in Telehouse Docklands where higher-speed connections were cheaper. This was important with all the streaming that became popular as the webcasts had content not entirely covered by Radio or TV.</p>
<div class="imgCaptionCenter" style="text-align: center; display: block; "><img alt="Budget" src="http://www.bbc.co.uk/blogs/bbcinternet/bbc_budget1996.png" width="430" height="70" class="mt-image-center" style="margin: 0 auto 5px;" /><p style="width:430px;font-size: 11px; color: rgb(102, 102, 102);margin: 0 auto 20px;"> </p></div>
<p><img src="http://www.bbc.co.uk/blogs/bbcinternet/img/199704xx_newyork_telehouse_.jpg" alt="199704xx_newyork_telehouse_.jpg" width="80" height="106" />"Nothing special," we thought in 1996, "who wants to listen to a ?" That November, though,  lots of people did; the 10-times peak for events became the norm, and the traffic started to stick. It helped News win the argument for ,  though there were only six weeks left when we got approval.</p>
<p>The budget was too small and bandwidth was cheaper in New York; I designed an architecture to exploit that with two sets of servers - one in Telehouse Docklands and one in Telehouse New York, with their own special  server that directed people to the appropriate server farm for their continent.</p>
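<p>The principle is simple enough to sketch (a toy illustration with invented address ranges and host names, not the server we actually ran): work out where the client is coming from and hand back the nearer farm.</p>
<pre><code># Toy illustration of the split-site idea, not the server we actually ran:
# UK and European clients are sent to Docklands, everyone else to New York.
# The address ranges and host names below are invented for the example.
from ipaddress import ip_address, ip_network

UK_EU_PREFIXES = [ip_network("81.0.0.0/8"), ip_network("212.0.0.0/8")]   # made up

def pick_farm(client_ip):
    addr = ip_address(client_ip)
    if any(addr in net for net in UK_EU_PREFIXES):
        return "docklands.farm.example"    # UK licence-fee payers get priority capacity
    return "newyork.farm.example"          # rest of the world gets the remainder

print(pick_farm("212.58.244.20"))   # docklands.farm.example
print(pick_farm("8.8.8.8"))         # newyork.farm.example
</code></pre>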
<div class="imgCaptionCenter" style="text-align: center; display: block; "><img alt="The growth of the New York servers" src="http://www.bbc.co.uk/blogs/bbcinternet/thny.jpg" width="430" height="574" class="mt-image-center" style="margin: 0 auto 5px;" /><p style="width:430px;font-size: 11px; color: rgb(102, 102, 102);margin: 0 auto 20px;">The growth of the New York servers </p></div>
<p>We also made a system for distributing content to the servers, called . The servers were scavenged from other projects. Most of the site - around 8,000 pages - was made automatically from the election computer via a CPS built in a few days.</p>
<p>The split site met a criterion that was to figure heavily later - keep the UK users' traffic independent of the rest of the world's. The UK licence-fee payers got preferential access; the rest of the world got whatever was left that we could afford to give them.</p>
<p>Chuffed with their success, News followed on with the , which had our first public video streaming of the Hong Kong handover.</p>
<p>So far, we all had other jobs to do as the site was neither official nor funded. Options were considered, including not having a site at all or giving the content to others. Pressure mounted, meetings went on and on... and then something happened. There had been a car crash and reports claimed that Diana was in it. There was nobody around to update the homepage, so I worked with a friend at World Service, who was in on the Sunday updating their site, until News arrived. We added a video stream while they built . On Monday, we started planning the funeral webcast and by the Saturday, we'd organised a syndicated stream fed from our site to EU ISPs and .</p>
<p>By a week later - 10 September - the response to the Diana coverage had convinced everyone that the internet would be big and that the BBC would be there - properly. With an October deadline, there was no point continuing with meetings. A committee wasn't going to make it. A ninja squad was needed.</p>
<p>I got a small bucket of cash and got told to do whatever was needed.</p>
<p>We got a bunch of shiny new Sun servers - 64 bits, too - installed them in Docklands and New York along with large internet links and updated lots of the software ready for the launch at the end of October.</p>
<p><strong>The next 10 years were the best, too</strong></p>
<p>News launched first in mid-November; not expecting the main infrastructure to be ready, they'd rented servers from an ISP. They had our first video-on-demand service: the One, Six and Nine O'Clock  in streamed video - as the  does today.</p>
<p>The main site launched on 15 December.</p>
<p>1998 was a year of events that were fun to webcast - , ,  - though less interesting for infrastructure development, as some applications that were requested failed to deliver: frustration all round.</p>
<p>In April we set up a 24/7  stream for World Service in case war broke out again. It didn't, but the stream was left running as it was too popular.</p>
<p>News was having performance problems with its outsourced servers, including ISPs having trouble reaching them. News's own ISP didn't have an open peering policy, leading to questions from others as to why they should pay to get to the News site when they peered with us on the main site for free. This came to a head in November; our offer was to add News hosting to the New York server farm, relieving their servers of non-UK traffic. This helped, and as their ISP was pulling out of the hosting market, News set up a set of servers on the BBC network in the UK too. This worked well with the huge growth.</p>
<p>The technology stayed the same for some time; we spent the next year tweaking it as applications and sites were added. I built up an operations team - one that had started in early 1998 with a couple of contractors - to handle the day-to-day support needed by the content producers and developers. An important part of their work became security checking developers' code. The BBC was promoting internet use in the UK, bringing in new, naive and vulnerable users. The last thing we wanted was tabloids running "BBC hacked" headlines and users being put off.</p>
<p>I was aiming for one team looking after operations on a technical and editorial (webmaster) level, but after hiring them, politics pulled them apart into two divisions - though they worked closely together.</p>
<div class="imgCaptionCenter" style="text-align: center; display: block; "><img alt="Kingswood Warren, August 1999" src="http://www.bbc.co.uk/blogs/bbcinternet/kingswoodgenerator.jpg" width="430" height="161" class="mt-image-center" style="margin: 0 auto 5px;" /><p style="width:430px;font-size: 11px; color: rgb(102, 102, 102);margin: 0 auto 20px;">Kingswood Warren, August 1999 </p></div>
<p>The site was getting quite busy; armed with the traffic we'd generated, it was time to play the next level: our own .</p>
<p>So far we'd gone through ISPs for internet connectivity. Rather than use one to forward all of our traffic to all of the others, we had the option of delivering it to each of them directly - but that was only sensible in certain circumstances. For the UK, it meant that we had to be an ISP and join LINX, where most exchanged traffic.</p>
<div class="imgCaptionCenter" style="text-align: center; display: block; "><img alt="See the network develop." src="http://www.bbc.co.uk/blogs/bbcinternet/ineto_20000504.png" width="430" height="400" class="mt-image-center" style="margin: 0 auto 5px;" /><p style="width:430px;font-size: 11px; color: rgb(102, 102, 102);margin: 0 auto 20px;">See the network develop.  </p></div>
<p>I'd been looking at this for some time, but finally in late 1999 found a way through the LINX membership rules that had prevented us joining so far. LINX worked well, and we later made similar arrangements in New York, expanding to the EU and the US west coast. We still took transit from a couple of ISPs to cover failures and to reach ISPs not present at locations we'd built our network into - around 15% of the total traffic - as it was cheaper.</p>
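<p>A toy model of that peering-versus-transit choice (invented ISP names, nothing like real BGP) shows the shape of it: anything reachable across an exchange goes that way for free, and the remainder falls back to paid transit.</p>
<pre><code># Toy model of the peering-versus-transit choice, not real BGP: destinations
# reachable across an exchange go that way for free; the rest fall back to
# paid transit. The ISP names are invented for the example.
PEERS = {"isp-a": "linx", "isp-b": "linx", "isp-c": "telehouse-ny"}

def route(dest_isp):
    if dest_isp in PEERS:
        return ("peering", PEERS[dest_isp])    # settlement-free at the exchange
    return ("transit", "upstream-isp")         # paid, covers ISPs we don't reach directly

flows = ["isp-a", "isp-b", "isp-c", "isp-d", "isp-a", "isp-b", "isp-a"]
paths = [route(d) for d in flows]
transit_share = sum(1 for kind, _ in paths if kind == "transit") / len(paths)
print(f"{transit_share:.0%} of flows via transit")   # 1 of 7 here; around 15% in practice
</code></pre>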
<p>As other high traffic sites grew, many followed the same path to self-hosting.</p>
<div class="imgCaptionCenter" style="text-align: center; display: block; "><img alt="The Domecam" src="http://www.bbc.co.uk/blogs/bbcinternet/domecam.png" width="430" height="288" class="mt-image-center" style="margin: 0 auto 5px;" /><p style="width:430px;font-size: 11px; color: rgb(102, 102, 102);margin: 0 auto 20px;">The Domecam </p></div>
<p>Having installed new infrastructure, we spent 1998-99 using it, taking on more content and applications with entertaining live webcasts. At Glastonbury, we fitted a remote control camera to a pole by the cow shed with visibility of the whole site (nobody wanted to clean that kit when it came back). We set up a webcam on a lighthouse -  - to watch the Millennium Dome being built; it lasted longer than the Dome. We did webcasts from all over - bird sanctuaries where BT delivered a phone line to a box in the middle of a field, and the Zambian desert for the 2001 eclipse, streaming live video via a satphone link. We set up Outside Broadcasts with an IP-over-satellite link so more webcasts could be done from around the UK.</p>
<div class="imgCaptionCenter" style="text-align: center; display: block; "><img alt="Plymouth webcam, May 2001; One Big Sunday, July 2001" src="http://www.bbc.co.uk/blogs/bbcinternet/20010xxxwebcams.jpg" width="430" height="472" class="mt-image-center" style="margin: 0 auto 5px;" /><p style="width:430px;font-size: 11px; color: rgb(102, 102, 102);margin: 0 auto 20px;">Plymouth webcam, May 2001; One Big Sunday, July 2001 </p></div>
<p>Our  did need some attention. The computer suite was completely rewired; the now-inadequate wall-mounted aircon was replaced by large floor-standing units. This was done with everything live: replacing the power wiring and main feeder meant running for a day on a mobile generator, with lots of hired-in theatre-lighting wiring run around the floors.</p>
<p>An attempt to have a unified CPS didn't work out, though quite a lot of work went into testing packages.  reviewed the infrastructure delivery and operation, reporting that we were considerably underspending compared to market norms, which quelled thoughts of outsourcing it for a while.</p>
<p>Y2K reared up and distracted us for a while: we were certain that the infrastructure was fine, but to comply with BBC policy we had to upgrade systems. As a test, we left one unpatched. It was fine. The only problem was content: scripts displaying <em>19100</em> instead of <em>2000</em> on web pages. It was quickly fixed as we had most of the ops team on site for our own New Year's Eve party and to be around just in case the world failed as some predicted. We didn't use the special phone BT installed.</p>
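<p>The <em>19100</em> glitch is a classic: many runtime libraries report the year as years since 1900, and scripts that glued a literal "19" on the front were fine right up until that count hit 100. A minimal reconstruction (not our actual scripts):</p>
<pre><code># Minimal reconstruction of the "19100" bug - not the actual scripts involved.
# C's struct tm (and Perl's localtime) report the year as years since 1900, so
# code that pasted a literal "19" on the front worked right up until 1999.
def buggy_banner(full_year):
    years_since_1900 = full_year - 1900
    return "Happy New Year 19" + str(years_since_1900)

def fixed_banner(full_year):
    years_since_1900 = full_year - 1900
    return "Happy New Year " + str(1900 + years_since_1900)

print(buggy_banner(1999))   # Happy New Year 1999
print(buggy_banner(2000))   # Happy New Year 19100  (what appeared on the pages)
print(fixed_banner(2000))   # Happy New Year 2000
</code></pre>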
<div class="imgCaptionCenter" style="text-align: center; display: block; "><img alt="Kingswood Warren, July 2001" src="http://www.bbc.co.uk/blogs/bbcinternet/kw_power_work.jpg" width="430" height="447" class="mt-image-center" style="margin: 0 auto 5px;" /><p style="width:430px;font-size: 11px; color: rgb(102, 102, 102);margin: 0 auto 20px;">Kingswood Warren, July 2001 </p></div>
<p>In September 2001, I was sat in an operations meeting when the pager went off and didn't stop: something big was happening. There was a massive influx of traffic to the site - a , it seemed. Damion called us back: <em>"there was this plane..."</em>. We turned on a TV and saw a burning World Trade Center tower. Then another plane. Ops worked on keeping the servers happy, raising the webmaster and News to agree sheddable load. This was the first time, so it took a while to get a new  in place. Our New York server farm was two blocks from the WTC site; it survived but suffered as power failed. The dust eventually clogged the generators and there were problems getting in fuel. The only outage was in the days after; we covered that by moving all traffic to London. The sites were designed to operate as hot spares for each other. We had planned around London suffering at some point, but it was the opposite.</p>
<div class="imgCaptionCenter" style="text-align: center; display: block; "><img alt="麻豆社T's first birthday party, April 2002" src="http://www.bbc.co.uk/blogs/bbcinternet/bbct.jpg" width="430" height="252" class="mt-image-center" style="margin: 0 auto 5px;" /><p style="width:430px;font-size: 11px; color: rgb(102, 102, 102);margin: 0 auto 20px;">麻豆社T's first birthday party, April 2002 </p></div>
<p>We resurrected  in 2002. Until then, we'd been providing audio in , originally downloadable until we had to disable that due to rights problems. All along, we'd been trying to promote  for access to our content, but sadly support was low. I'd wanted to use  for audio streaming in 1998. Some of our audio engineers were working on the standard, but the market was dominated by  promoting their own  instead.</p>
<p>When  started, it seemed that finally an open standard would emerge and be widely supported.  of streamed and downloadable programmes which some used on portable audio players - podcasts in all but name. Really, we needed to do  to make it easier for general use, but due to its associations with piracy, it wasn't acceptable to the rights holders.</p>
<p>I had similar ambitions for  video as a common format, which is now, many years later, starting to take off. To control costs and make content universally available we've always looked for common standards so that - as with TV and radio - the audience can choose which manufacturer or player to use and so that the content only needs to be produced once, rather than in many formats. There is a significant cost for additional formats that could otherwise go towards new services.</p>
<div class="imgCaptionCenter" style="text-align: center; display: block; "><img alt="Packing to leave Kingswood, September 2002" src="http://www.bbc.co.uk/blogs/bbcinternet/kwopspackingformh.jpg" width="430" height="280" class="mt-image-center" style="margin: 0 auto 5px;" /><p style="width:430px;font-size: 11px; color: rgb(102, 102, 102);margin: 0 auto 20px;">Packing to leave Kingswood, September 2002 </p></div>
<p>So far, I'd been running the department in Kingswood which housed the operations and development teams, master content servers, streaming audio and video coding and content ingest. Then, in 2002, BBC Technology formed, bought the remains of a dead dot.com and decided to move both to a site in Maidenhead. We lost half the staff in the move and new management was brought in to look after the combined operation, later to be sold to .</p>
<div class="imgCaptionCenter" style="text-align: center; display: block; "><img alt="The master content servers and stream encoding were at Kingswood until 2003" src="http://www.bbc.co.uk/blogs/bbcinternet/kwserverspremovetomh.jpg" width="430" height="191" class="mt-image-center" style="margin: 0 auto 5px;" /><p style="width:430px;font-size: 11px; color: rgb(102, 102, 102);margin: 0 auto 20px;">The master content servers and stream encoding were at Kingswood until 2003 </p></div>
<p>The infrastructure remained much the same after that, with obvious increases in capacity needed to keep up with demand. In 2007, to reduce costs, the USA network was closed down and all hosting consolidated over here...</p>
<p>...taking us back to where we started: commercial hosting in just the UK.</p>
<p><em>Brandon Butterworth is Principal Technologist, Kingswood Warren</em>.</p>]]></description>
         <dc:creator>Brandon Butterworth</dc:creator>
	<link>http://www.bbc.co.uk/blogs/bbcinternet/2007/12/brandons_history_of_bbc_on_the_2.html</link>
	<guid>http://www.bbc.co.uk/blogs/bbcinternet/2007/12/brandons_history_of_bbc_on_the_2.html</guid>
	<category>BBC Online</category>
	<pubDate>Tue, 18 Dec 2007 13:27:53 +0000</pubDate>
</item>


</channel>
</rss>

 
