About

On March 25, 1999, a KMi Stadium Webcast team traveled to the Wytch Farm BP Amoco oilfield in Dorset to assist the Water and Gas Control project in capturing and sharing expert knowledge with colleagues across the company. In two webcasts, each lasting approximately 40 minutes, three oilfield engineers spoke live from the floor of a working field stores shed via the KMi Stadium. Around 50 colleagues tuned in from their desks worldwide, spanning from Bogotá to Houston, London, and Aberdeen. Some attendees participated in both live sessions.

John Davies from the Wytch Farm field introduced the event, setting the context for the presentation, which focused on the use of inflatable tools in water and gas intervention. The main segment featured BP engineer Andrew Patterson from Aberdeen, who controlled a presentation streamed to remote participants. His session included slides, animations, simulations, and multimedia content showcasing the inflatable device. At the conclusion, Gordon Mackenzie, a Baker Oil Tools Senior Applications Specialist from Houston, joined the discussion to address questions about the downhole technology.

Nearly half of the remote audience engaged in the Q&A session. A replay was instantly made available for those unable to attend live and was later integrated into a significant legacy website on the BP intranet. This resource, enriched with further multimedia and support, allowed workers to continue exploring the topic.

Feedback on the event and the Stadium technologies was overwhelmingly positive. One Aberdeen participant, reflecting on the first session, remarked:

The webcast was an excellent example of how new technology had been used to distribute information, and I saw many potential applications for this approach in the future. I wondered how much it would have cost to have the same 35 people in one room, considering expenses—not to mention the man-hours!

Check out the replay

NB: This previously required Internet Explorer 3.02+ with the Shockwave 7, QuickTime 3+, and RealNetworks G2 plug-ins installed. This version was under 1 MB and streamed at 34 Kbps, making it suitable for a maximum 56K bandwidth connection.

What went on behind the scenes

The mezzanine level of the Holton Heath stores was turned into a multimedia studio for the first-ever interactive webcast on the BP Amoco intranet.

A webcast crew from the Open University produced a live video stream, which was encoded and then relayed to a server in Sunbury before being split to a repeater server in Aberdeen to serve a total of around 50 clients.

Andrew Patterson (UTG Well Intervention Team) and Gordon Mackenzie (Baker Oil Tools) presented the Inflatables Webcast.

Andrew spoke to camera whilst in control of his own slides. Gordon reviewed questions and comments submitted by the audience via a text chat session during the event.

More about the Webcast Network Architecture

The AV setup

The AV was captured in Wytch Farm by a small outside broadcast team from the Open University.

The setup included three cameras and three microphones. The audio feeds were mixed and balanced via an audio mixing desk, and the video feeds were simply switched. One camera was a simple and inexpensive webcam; one was a domestic-quality Hi8; and one was a broadcast-TV-quality camera. The latter was used as the main video source, with the others providing alternate shots for variety. The webcam, for example, was directed out into the stores area of the shed and was used before and after the presentation to give a live context to the interface (on occasion showing a forklift moving around beside the presentation area)!

The AV Encoder

The AV feed was encoded locally on a dual Pentium II (300 MHz) via the RealNetworks v5 encoder using their own streaming codecs. We aimed for a 34 Kbps bitrate to fit reliably within a 56 Kbps pipe (made up of 8 Kbps audio and 26 Kbps video). This single 34 Kbps stream was sent over the BP Amoco intranet to an AV server in Sunbury, near London, England.
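The bitrate budget above can be sketched as a back-of-the-envelope check. The figures (8 Kbps audio, 26 Kbps video, a nominal 56 Kbps pipe) come from the text; the headroom calculation is our own illustrative addition.

```python
# Illustrative bandwidth budget for the 1999 webcast encoder settings.
AUDIO_KBPS = 8    # audio portion of the encoded stream
VIDEO_KBPS = 26   # video portion of the encoded stream
PIPE_KBPS = 56    # nominal capacity of a 56K modem connection

stream_kbps = AUDIO_KBPS + VIDEO_KBPS    # 34 Kbps total stream
headroom_kbps = PIPE_KBPS - stream_kbps  # 22 Kbps left for overhead

print(f"stream: {stream_kbps} Kbps, headroom: {headroom_kbps} Kbps")
# A 56K modem rarely sustains its nominal rate, so this generous
# headroom is what made the 34 Kbps target reliable in practice.
```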

The AV Servers

There were three AV servers deployed for this event, all Pentium II (400 MHz) NT servers running the RealNetworks v5 server (licensed for a maximum of 300 simultaneous users). The principal server in Sunbury received the encoded stream, which it made available to all local networked users as a unicast (i.e. each user taking an individual 34 Kbps stream from the server). In addition, the Sunbury server had been pre-configured to "split" its services to two further servers: one in Aberdeen, Scotland, the other in Houston, Texas. Both of these split servers were configured identically to the Sunbury machine, each taking a single 34 Kbps stream which was made available to local clients. The split architecture was required to avoid the risk of network traffic bottlenecks between the various segments of the BP Amoco intranet. On the day this architecture proved unproblematic, with one major exception! The machine deployed in Houston performed unresponsively during the morning rehearsals and so was excluded from the services offered live. Instead, the relatively small number of users in the Americas were seamlessly switched to the Sunbury server for the live event. Whilst this increased the risk of poor performance for those clients, in reality the BP Amoco transatlantic link was quite adequate for this traffic.
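The split-server arrangement above can be sketched as a simple routing rule: the origin server in Sunbury feeds regional repeater ("split") servers, and each client takes its 34 Kbps unicast from its regional server, falling back to the origin if that repeater is out of action (as happened with Houston). The server names come from the text; the data structures and fallback rule are our own reconstruction, not BP Amoco's actual configuration.

```python
# Hypothetical sketch of the split-server routing described above.
ORIGIN = "Sunbury"

# Which servers were actually serving clients during the live event.
servers_up = {
    "Sunbury": True,
    "Aberdeen": True,
    "Houston": False,  # performed badly in rehearsal, excluded from service
}

def assign_server(region: str) -> str:
    """Route a client to its regional repeater, or to the origin
    if that repeater is out of action."""
    if servers_up.get(region, False):
        return region
    return ORIGIN

# Americas clients fall back to Sunbury across the transatlantic link:
assign_server("Houston")   # → "Sunbury"
assign_server("Aberdeen")  # → "Aberdeen"
```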

Event Management Servers

In tandem with the AV services, the KMi Stadium architecture uses a range of management servers to direct non-AV event information to clients. In this webcast, the main sources of this information were slides and chat. The slide information relates to the "state" of the presentation (e.g. which slide the speaker wants the audience to see), whilst the chat information relates to any text comments being made by participants. The "state" of chat and slides was captured on processor clients running locally in Wytch Farm on a dedicated Pentium II (400 MHz).

The management servers themselves were co-located with the AV services on each event server. In this event we used Macromedia Shockwave 7 Multiuser servers (licensed for 500 clients), which were configured to be dumb, simply taking instructions from the processor clients and forwarding these to all connected users. The data rate of management information was variable but very low, including, for example, a stay-alive ping from all connected users (to register their continuing interest in the event)!

The biggest problem faced by event management is a function of its independence from the AV. The AV server technology we used in this event can introduce a substantial lag between encoding and decoding, and the lag is different in each case: in some cases the participant client is seeing and hearing something some 20 seconds after the presenter said it! Given that this is a broadcast technology, designed to be primarily one-way, this lag is not evident to the audience. However, the event technology can be nearly synchronous, meaning that it is essential to introduce an artificial delay before any events are acted upon in the remote client. In this software the events are stacked and individually delayed in each local client to match the AV delay.
On the day, local synchronisation was fairly good and stable: when the presenter said something, it happened for each remote client at about the right time! One feature of this service which has been omitted from figure 2 (for simplicity) is that none of the Macromedia multiuser servers had any special status (cf. the Sunbury AV server). Instead, the control architecture mandates the processor clients on the event manager machine to keep each of the remote 'split' servers in sync with each other, independently and directly.
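The delay-matching idea above can be sketched as a small queue: non-AV events (slide changes, chat) arrive almost immediately, so each client holds them and only acts on them once the locally measured AV lag has elapsed. The structure is our illustration; the real KMi Stadium client was Shockwave, not Python.

```python
import heapq

class DelayedEventQueue:
    """Holds near-instant events until the local AV lag has elapsed,
    so slide changes land in sync with the delayed audio/video."""

    def __init__(self, av_lag_seconds: float):
        self.av_lag = av_lag_seconds  # measured encode-to-decode lag
        self._heap = []               # (due_time, event) pairs

    def push(self, event, arrival_time: float):
        # Stack the event, due when the matching AV frame will play.
        heapq.heappush(self._heap, (arrival_time + self.av_lag, event))

    def pop_due(self, now: float):
        # Release every event whose artificial delay has elapsed.
        due = []
        while self._heap and self._heap[0][0] <= now:
            due.append(heapq.heappop(self._heap)[1])
        return due

q = DelayedEventQueue(av_lag_seconds=20.0)  # e.g. a 20-second AV lag
q.push("show slide 7", arrival_time=100.0)
q.pop_due(now=110.0)  # → [] : the slide change is still being held
q.pop_due(now=120.0)  # → ["show slide 7"] : released in sync with the AV
```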

The clients

The base client used in the webcast was Microsoft Internet Explorer 3.01 (the standard browser of the BP Amoco Common Operating Environment). The browser required a small range of plug-in software to be added, including Macromedia Shockwave 7, QuickTime 3, and RealNetworks G2. This software was included in a BP Amoco automatic COE update via the SMS software management system: all users who registered on the webcast web pages prior to the event received SMS packages which automatically installed the appropriate software. For those who did not register in advance, some of the software was also set to auto-install on demand via the web page.

The presentation client was set to automatically cache the slides used in the presentation when the user first opened the relevant page. The Macromedia Shockwave cache was used, to avoid the risk of the browser flushing its own cache before the presentation. For users on a slow bandwidth connection, the larger presentation assets (primarily the QuickTime movies) were set to stream to the user, so that they would be in the cache before being required but did not prevent the user from starting the presentation. All these non-live assets, including the movies and all the control software, were bundled into a cache of less than one megabyte. The live audio and video feeds were streamed to the user on demand, requiring an open connection with a stable data rate but little cache or disk.
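The caching policy above can be sketched as a partition of the asset bundle: small assets load eagerly, larger ones (the QuickTime movies) stream in the background, and the whole bundle must fit the sub-1 MB budget. All asset names, sizes, and the 100 KB threshold are invented for illustration.

```python
# Hedged sketch of the pre-caching policy described in the text.
CACHE_BUDGET_KB = 1024         # "less than one megabyte" of non-live assets
BACKGROUND_THRESHOLD_KB = 100  # assumed cutoff: larger assets stream lazily

# Hypothetical asset bundle (names and sizes invented for illustration).
assets = {
    "control_software.dcr": 80,
    "slide_deck.dcr": 60,
    "flash_animations.swf": 90,
    "inflatable_demo.mov": 450,  # large movie: streamed in the background
    "qtvr_object.mov": 300,
}

def plan_cache(assets):
    """Split assets into eagerly loaded vs background-streamed, and
    confirm the whole bundle fits the cache budget."""
    eager = [n for n, kb in assets.items() if kb <= BACKGROUND_THRESHOLD_KB]
    streamed = [n for n, kb in assets.items() if kb > BACKGROUND_THRESHOLD_KB]
    total_kb = sum(assets.values())
    assert total_kb <= CACHE_BUDGET_KB, "bundle exceeds the cache budget"
    return eager, streamed, total_kb
```

The key design point the sketch captures is that streaming the big assets keeps startup fast without breaking the budget: the user can open the presentation while the movies are still filling the cache.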

More about the Webcast Interface

Like most of the KMi Stadia, the interface consisted of the usual collection of simple regions: AV, Slides, Controls, and Chat.

A small AV region was provided at the top left of the interface: before the webcast it showed an introductory video in which the event organiser introduced the technology and context for the event; during the webcast this was switched to the live feed from the Wytch Farm oilfield; and afterwards it was switched to a replay of the live event. In all cases this streamed from the server to the client and was in sync with the other pieces.

The main control panel was directly beneath this video window. The key controls allowed users to start/stop the AV stream, adjust the settings, open the chat window, get detailed state information about the system and so on. In the interface there was a narrow chat input box at the bottom of the main slide pane. Users were able to type any comments or questions into this box and submit them throughout the presentation, without distracting from the slides region. To view the whole chat (including everyone else’s public comments) they could zoom this region to cover most of the slide pane.

The slide pane itself was positioned to the right of the controls area; it was the largest area of the interface and was under the direct control of the presenter during the live session. In addition to the usual graphic and bullet-point slides, the presentation also made use of more sophisticated media, including a QuickTime VR object, a QuickTime movie, a simple section from a simulation, and a number of Macromedia Flash animations.

People

Presenters

Andrew Patterson (BP Amoco)

Gordon Mackenzie (Baker Oil Tools)

John Davies (BP Amoco)

Chat Facilitator

Martin Geddes (Baker Oil Tools)

Technical Liaison

Harry Cassar (BP Amoco)

Management Observers

Kate Bell (BP Amoco)

Richard Fulleylove (BP Amoco)

Producer

Peter Scott (Open University)

BP Amoco Coordinator

David Stevens (BP Amoco)

Stadium Crew

Mike Lewis (Open University)

Tony Seminara (Open University)

AV Crew

Andrew Rix (Open University)

Owen Horn (Open University)