
Energy, technology and climate change: a new world

Speech given by Malcolm Brinded, Executive Director, Upstream International, and Simon Henry, Chief Financial Officer, Royal Dutch Shell plc, at the Churchill College, Cambridge, November 11, 2010.

Growing population levels and surging economic growth are driving rising energy demand across the world. But with greenhouse gas emissions continuing to rise, all countries must find more energy at a much-reduced cost to the environment in the coming decades. In this speech, Malcolm Brinded, Executive Director of Upstream International at Royal Dutch Shell, and Simon Henry, the company’s Chief Financial Officer – both Churchill graduates – discuss four ways Shell is using its technological and scientific expertise to help the world develop a secure and sustainable energy supply. First, advances in seismic technology that are allowing the company to open up resources from hard-to-reach locations. Second, new techniques in gas production that are opening up vast resources of natural gas, which is the quickest and cheapest way to cut CO2 emissions from the global power sector. Third, Shell’s development over more than three decades of the technology that converts natural gas into liquid fuels. And fourth, the company’s imminent move into the large-scale production of biofuels in Brazil.

Energy, technology and climate change: a new world

As Churchill graduates, we’re especially proud of the links between Shell and the college. They could hardly be stronger or more significant.
 
As you know, the college’s origins can be traced back to the early 1950s, and Winston Churchill’s desire to establish a postgraduate institute for engineers and scientists to rival the Massachusetts Institute of Technology. 
 
Shell played a crucial role in realising this aspiration in the form of Churchill College. John Oriel, who had been the General Manager of the Shell Refining and Marketing Company, was one of the two prime movers in the drive to establish the college, alongside Sir John Colville, Churchill’s Private Secretary.
 
In fact, thanks to his efforts, Shell pledged 100,000 pounds – the equivalent of around 2 million pounds today – to help set up the College. And Oriel became one of the college’s founding fellows.
 
Over the subsequent half century, Shell and Churchill have maintained a strong relationship. For example, as Shell Industrial Fellow Commoners, members of industry would stay at Churchill while conducting research at Cambridge. Moreover, to mark the college’s Silver Jubilee in 1986, Shell funded a Lectureship in Control Engineering there.
 
And, of course, many Churchill alumni have joined Shell over this period – we’re not sure of the exact number, but we think it works out at about one every two years.
 
(Malcolm Brinded) With that in mind, we will focus on some of the ways in which Shell is using science and technology to help the world develop a secure and sustainable global energy system.
 
In Shell’s day-to-day work we draw upon all manner of scientific and technical disciplines, from materials science to advanced mathematics, from chemical and civil engineering to agricultural science – and an array of specialised fields in between.
 
And the need for these advanced skills will only grow more important as the energy landscape is reshaped by surging demand and global efforts to tackle CO2 emissions.
I’ll begin by describing these trends, before addressing two ways in which Shell is responding through the application of new technology.
 
First, I’ll focus on some recent advances that we’ve made in seismic technology that are allowing us to find oil resources that had earlier been concealed.
 
I’ll then discuss the ongoing boom in unconventional gas production. Here, technological advances are driving a massive expansion in the world’s gas resources. This matters because natural gas is the quickest and cheapest way to cut CO2 emissions from the global power sector.
 
(Simon Henry) I’ll then pick up the thread with another major breakthrough in the global gas market: the conversion of natural gas into liquid fuels – soon to begin on a large scale at Shell’s massive $18-19 billion Pearl GTL plant in Qatar.
 
Next, I’ll describe some of the science and technology underpinning the expansion of the biofuels industry, which will be critical to tackling CO2 emissions in the transport sector – before finishing with a word on the broader geo-political context in which the energy industry takes its place.

The global energy challenge

Figure 1 – Rising energy demand

(Malcolm Brinded) Global demand for energy could double in the first half of this century, according to the International Energy Agency, driven by a rising global population – 9 billion people compared to today’s 6.5 billion or so – and especially by economic growth in the developing economies.
 
China’s GDP growth is running at some 10% per annum, while India and Brazil are not far behind. And it’s this kind of growth that helps to explain why China’s gas consumption could treble by 2020.
 
To keep pace with rising demand, the world will need to invest heavily in all energy sources, from oil and natural gas, to biofuels, nuclear power, solar and wind. 
 
Not to do so would leave billions of people in the energy poverty and deprivation they face today – for example, more than 1.4 billion people are without access to electricity (IEA), while nearly a billion still use unsafe sources of drinking water (UN – 884 million).
 
At the same time, we must urgently tackle greenhouse gas emissions. According to the consensus of climate scientists, atmospheric CO2 concentrations should be limited to 450 parts per million to avoid levels of global warming with significant negative consequences.
 
On one measurement (at Mauna Loa, Hawaii) concentrations have now reached 390 ppm – so just 60 ppm to go – yet they continue to rise at an annual rate of 2 ppm. The clock is clearly ticking.
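As a back-of-envelope check on that clock, the figures quoted above pin down the timeline directly (a sketch only; it assumes the 2 ppm annual rise stays constant):

```python
# CO2 concentration figures quoted in the speech: ~390 ppm today,
# rising ~2 ppm per year, against a commonly cited 450 ppm limit.
current_ppm = 390
limit_ppm = 450
rise_per_year = 2.0

# Years until the limit is reached, if the rise stays linear.
years_remaining = (limit_ppm - current_ppm) / rise_per_year
print(f"At {rise_per_year:.0f} ppm/year, {limit_ppm} ppm is reached in "
      f"about {years_remaining:.0f} years")
```

On these numbers the 450 ppm threshold arrives around 2040 – hence the ticking clock.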
 
Over time, cleaner energy sources will meet a growing share of demand, as global efforts to tackle climate change gather pace. For example, the Chinese government has announced a voluntary 40–45% reduction in the carbon intensity of its economy by 2020 compared to 2005. And in the US, the government has committed to an ambitious 17% reduction in CO2 emissions by 2020 compared to 2005.
 
But even so, fossil fuels will continue to meet the majority of global demand for decades. After all, there are significant technical and financial constraints to deploying alternative sources on a mass scale.
 
Our industry is very different to consumer electronics, where businesses are under pressure to develop and market new mobile phones, for example, within 18 months.
 
At Shell, we’ve researched all current energy types. And we found that in the 20th century, it took around 30 years for new energy sources and carriers to capture 1% of the market after commercial introduction. For example, the first liquefied natural gas plant came on-stream in 1964 in Algeria, using Shell technology. Since then the growth of LNG has been spectacular. But four decades later, the share of LNG in the global energy mix is still only 2%.
Yet we still think that by the middle of this century, up to 30% of the world’s energy could come from wind, solar and other renewable sources.
 
And that would be a historic transformation from the 13% share they hold today. In absolute terms, output from renewable sources would increase by more than 300%.
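The “more than 300%” figure can be reproduced from the shares quoted above, assuming (as the speech notes earlier) that total demand roughly doubles by mid-century:

```python
# Renewables' share of global energy: ~13% today, up to ~30% by 2050,
# while total demand may double (figures quoted in the speech).
today_share = 0.13
future_share = 0.30
demand_growth = 2.0  # total demand roughly doubles

# Absolute renewable output, relative to today's level.
multiplier = (future_share * demand_growth) / today_share
increase_pct = (multiplier - 1) * 100
print(f"Renewable output grows ~{multiplier:.1f}x, "
      f"an increase of ~{increase_pct:.0f}%")
```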
 
But it would also mean that, even then, fossil fuels would still supply almost two-thirds of global energy.

So the first question is whether that can be provided.

Seismic technology

Figure 2 – Perdido project

I’ve already described how long-term energy demand is rising. That includes demand for oil at least until 2030. But the natural decline in the production rates of developed fields will make keeping pace with this demand all the more difficult.

In fact, the combined effect of increasing demand and falling production rates means that the world will need to produce an additional 40 million barrels of oil a day by 2020. Forty million barrels a day is about four times what Saudi Arabia produces, or ten times what the UK and Norway together produce. Most of it will need to come from resources that haven’t even been found yet.

That is why the oil and gas industry continues to explore all over the world. Shell remains one of the world’s biggest explorers. In the last three years we have spent around $3 billion a year on exploration. And we are conducting seismic surveys and drilling wells in the 38 countries shown on this map.
 
Almost half of the world’s yet-to-be-found oil lies offshore, according to a comprehensive global assessment done by the US Geological Survey in 2000. For that reason, we’ve been developing technology to help us explore and operate safely in deeper and deeper water.

Some of this technology is contained in new, safer-design floating platforms like this Perdido installation – which started production in March this year in the deep waters of the Gulf of Mexico.

This is, in fact, the world’s deepest offshore producing platform, set in 2.2km water depth – that’s 50% deeper than the Macondo well site where BP had the Gulf of Mexico blowout.  Such a major platform – and the oil wells that supply it from the seabed – costs over $5 billion to build and install. So one thing’s for sure – we need to be certain we put it in the right place!

That’s why we spend our exploration dollars – trying to be sure we know where the oil and gas is.

Seismic surveys

And one of the key exploration operations we conduct is a seismic survey. It results in images of underground rock layers.
 
In an idealised situation the basics of marine seismic exploration are easy to grasp. A ship tows a source of sound waves that travel down into the layer-cake rock formations beneath the seabed. At each of the interfaces between layers the waves are reflected back to the surface, where they are picked up by pressure sensors strung out behind the same ship.
 
If the time between a seismic “shot” and the detection of its echoes is plotted for each reflection point, then the layers can be traced out. And if the speed of sound through each of the layers is known, then the image can be converted into a cross-sectional image of the earth.
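That time-to-depth conversion can be sketched in a few lines. The layer velocities below are illustrative values, not survey data:

```python
# Idealised layer-cake model: each reflector's depth follows from the
# incremental two-way travel time through a layer and that layer's
# (estimated) speed of sound.
two_way_times_s = [1.2, 0.8, 0.6]    # incremental two-way time per layer
velocities_m_s = [1500, 2200, 3000]  # e.g. water, soft sediment, harder rock

depth_m = 0.0
for t, v in zip(two_way_times_s, velocities_m_s):
    depth_m += v * (t / 2)           # one-way time is half the echo time
    print(f"reflector at ~{depth_m:.0f} m")
```

In practice, of course, the velocities must themselves be estimated – which is exactly where sub-salt imaging runs into trouble, as described below the salt canopies of the Gulf of Mexico.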

Of course, the real world is not ideal. Rock strata are often faulted and deformed. And geophysicists don’t know the different speeds of sound in each layer; they have to estimate them.
 
Still, by the 1980s science, engineering, mathematics and computers had managed to produce images of the earth’s rock layers that were accurate enough in most situations.

Challenge in the Gulf of Mexico

But one prospect that tested the limits of marine seismic imaging was located in the deep rock layers under the Gulf of Mexico.

There, sediments had been deposited on a thick and extensive bed of rock salt: the remains of repeatedly evaporated seas of the primordial earth. The “rock” in rock salt is a misnomer: over geological timescales it flows like putty, driven by the uneven weight of overlying sediment.
 
The salt had a tendency to rise in columns, forming mushroom-like canopies – some six kilometres high. The underside of such salt overhangs sometimes serves as structural “traps” where oil and gas can accumulate.
 
Conventional marine seismic surveys were thrown off track by another property of underground salt: the speed of sound in it can be twice as fast as that of the surrounding sediments. The salt structures of the Gulf thus refracted and diffracted seismic waves every which way. Imaging what lay below such a salt structure with seismic waves shot from directly above was a bit like looking through roughly textured glass blocks.
 
Still, we thought that we had sussed out the geological situation at the Deimos prospect at a water depth of between 1000 and 1300 metres in the Gulf of Mexico. And in 2004 we drilled an exploration well that we thought had a 70% chance of striking oil.
 
The well – which cost in the order of $100 million – was “dry”: it did not show signs of oil or gas. Geologists were at a loss to explain the absence of the reservoir, and the exploration programme in this location was suspended.

“All-azimuth” seismic survey

A few years later, however, the Deimos area was once again slated for a seismic survey as part of a redevelopment project involving a couple of nearby fields, including the prolific Mars field. We took advantage of the opportunity to make clearer seismic images. One way to do this would be to let the seismic waves strike the area under the Deimos salt structure more from the side rather than from directly above. And not just from one side, but from all sides.

Think of a torch shining obliquely on a rough surface. Ridges and grooves perpendicular to the light beam cast shadows that can perplex the viewer. But by illuminating the surface from various directions, the orientations of the ridges and grooves can be better determined. To make such an “all-azimuth” seismic survey possible, we dispensed with the towed sensors. Instead, we set the sensors down on the ocean bottom – 807 of them, to be precise – with the aid of deep-diving remotely operated vehicles.

We also processed the acquired data with more intensive computations that improved upon earlier mathematical approximations for solving the equations governing seismic waves. These computations no longer made the simplifying assumption that the seismic waves could be modelled like light rays in geometric optics, for example. Nor did they ignore various wave-interference phenomena. This type of processing only became possible with the recent increases in the number-crunching power of computers. Their number of calculations per second has increased a few thousand-fold over the past decade.

Our ocean-bottom all-azimuth survey – only the second one done in deep water and at a total cost of some $28 million – did indeed reveal a clearer picture. It showed why the 2004 well had missed the oil: it was drilled with the wrong geologic model in mind. Nothing in the earlier fuzzy seismic images had suggested that geological faults were found in the relatively young Miocene formations underlying the salt. The new seismic images, however, clearly showed faulting to be an important feature of the sub-salt formations. They also showed where the oil was likely to be found.

On the basis of this updated imaging, another exploration well drilled in 2009 confirmed the existence of the reservoir. We now call it the West Boreas field. Another discovery – South Deimos – was also found nearby on the basis of the new seismic imaging. Together, they add the equivalent of more than 150 million barrels of oil to the already prolific Mars Basin area. That would mean over $12 billion of revenues at current oil prices.
 
And so we have just announced our decision to develop these two fields in the go-ahead of the Mars B development project, installing another giant 21,000-tonne floating platform designed to process approximately 100,000 barrels of oil equivalent per day from as many as 24 subsea wells in water 1,000 metres deep. Six of those wells will come from West Boreas and South Deimos.

The natural gas revolution

I’ll now switch focus to natural gas, which as you can see, has grown dramatically in importance as a global source of energy since the middle of the twentieth century. And recent technological advances are making an even bigger impact than deep-water seismic.

Natural gas advantages

This is great news for the world because the quickest and cheapest way to cut CO2 emissions from the global power sector is to grow the presence of natural gas at the expense of coal. Natural gas is the cleanest-burning fossil fuel: modern gas plants emit half the CO2 of modern coal plants, and 60-70% less CO2 than old coal plants, of which there are still hundreds in operation today in China, North America and Europe.

Natural gas capacity is also considerably faster and cheaper to install than other new build sources of electricity. And gas-fired power stations can be switched on and off with relative ease, making them ideal allies of the intermittent power generated by wind turbines and solar panels.

Advances in “tight” gas production

Figure 3: Fracking in the field

Only a few years ago, it looked as if North America’s domestic gas production would decline. But in fact its gas production has increased dramatically. The reason is that drilling companies managed to unlock the gas resources in rock so “tight” that gas seeps through it – at best – about a thousand times slower than it would through an ordinary reservoir. They developed a method to crack open the rock to allow the gas to flow into wells much more freely.
 
These rock-cracking operations, known as hydraulic fracturing or “fracking”, involve pumping liquids into a sealed well section under such high pressure that the rock around the well splits open. As the crack widens and lengthens, it is filled with a granular material known as proppant, through which gas can readily flow. The proppant keeps the crack open after the pressure is released.
 
Fracking – not only of gas wells but also of oil wells – has actually been practiced in the field for several decades. But these operations did not always result in increased production. Sometimes the proppant plugged up the crack, stunting its growth. Sometimes the cracks curled around the well before extending into the formation, creating gas-flow detours. And the power and pressure available were sometimes not up to the job.

“Fracking” improvements

But in the last decade, major leaps have been made in the size and efficiency of the fracking kit. The pieces of kit can include multiple 1,500-horsepower pumps that can achieve rock-fracturing pressures as high as 1,000 bar and proppant-carrying flow rates as high as 260 litres per second. (For comparison: a typical fracking pump is more powerful than a Formula 1 engine and moves liquids at 20 times the rate of the race car’s pit-stop refuelling.)
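Those figures imply striking power levels. As a rough upper bound (assuming peak pressure and peak flow coincide, which they may not in practice), hydraulic power is simply pressure times volumetric flow rate:

```python
# Fracking-kit figures quoted in the speech.
pressure_pa = 1000 * 1e5   # 1,000 bar, in pascals
flow_m3_s = 260 / 1000     # 260 litres/second, in m^3/s
hp_to_watts = 745.7        # one mechanical horsepower

hydraulic_power_w = pressure_pa * flow_m3_s
pumps_needed = hydraulic_power_w / (1500 * hp_to_watts)
print(f"~{hydraulic_power_w / 1e6:.0f} MW of hydraulic power, "
      f"i.e. roughly {pumps_needed:.0f} pumps of 1,500 hp each")
```

Around 26 MW at full rate – which is why a single frack job lines up a whole fleet of pump trucks, not one.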
 
For tight gas to reach its full potential, more targeted ways of planning, executing and monitoring these fracking operations have also had to be developed.
 
In particular, our study of fluid and rock mechanics under the high-pressure conditions of fracking has enabled us to program computer applications that predict where the rock is most likely to yield and how long and wide the crack will grow. They model the complex interplay of fluid pressure and flow with rock stresses and strains in three dimensions. We thus have a good idea of how best to orient a well in the reservoir and what the optimal combination of pressure, flow rate and proppant volume is for fracking it even before we begin drilling the well.
 
As we undertake fracking operations, around 20 micro-seismic sensors in a nearby well listen to the popping and creaking of growing cracks. The sounds enable us to map out the contours of a growing crack and adjust the pump programme accordingly.
 
We also have moved from fracking predominantly vertical wells to fracking mainly horizontal wells. Only a small segment of a vertical well – perhaps 15 metres – intersects the gas-bearing zone. But with a horizontal well, a thousand metres or more can lie entirely within the gas-bearing zone, providing plenty of potential locations for fracturing.

And if we align the wellbore axis to the minimum horizontal stress, then the fractures should sprout around the wellbore, yielding the highest productivity increase.
 
At Groundbirch in western Canada we have gone from fracking at 300-metre intervals to fracking at 100-metre intervals. And this year we have reduced the spacing to 50 metres.
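The payoff of tighter spacing along a horizontal well is easy to quantify. Taking the thousand-metre lateral mentioned above:

```python
# Number of fracture stages along a horizontal section at the spacings
# quoted for Groundbirch.
lateral_length_m = 1000  # "a thousand metres or more" in the gas zone

for spacing_m in (300, 100, 50):
    stages = lateral_length_m // spacing_m
    print(f"{spacing_m} m spacing -> {stages} fracture stages")
```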
 
A full-scale tight gas development takes hundreds of wells, each with multiple fractures. The key to a profitable development is to use the learning curves from these large-scale drilling and fracking programmes to drive down costs.
 
Our operations at Pinedale, in the US state of Wyoming, are a great example. In 2002 it took us an average of 60 days to drill a well there. Today, an average well takes just over 25 days – 60% quicker. And that’s to a depth of over 4,500 metres.
 
The other thing to note from the slide is that we are learning faster – the later curves are steepening. At Groundbirch, which we acquired in 2008 as part of our purchase of the Canadian company Duvernay, it took us only three years rather than eight to achieve nearly the same improvement as at Pinedale.
Being able to drill and frack wells more quickly gives us greater flexibility to adjust operations according to the prevailing economics, our capital resources and other extraneous constraints, such as lease expiry dates.
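Learning curves of this kind are conventionally modelled as a power law in cumulative experience, days(n) = days(1) · n^(−b). The exponent below is fitted to the Pinedale figures quoted in the speech; the cumulative well count is an assumed round number, not Shell data:

```python
import math

# Pinedale figures from the speech: ~60 days per well in 2002, ~25 now.
days_first = 60.0
days_now = 25.0
wells_drilled = 200  # assumed cumulative well count, for illustration

# Fit the learning exponent b in days(n) = days(1) * n**(-b).
b = math.log(days_first / days_now) / math.log(wells_drilled)
print(f"fitted learning exponent b = {b:.3f}")

# Extrapolate: days per well after experience doubles again.
days_after_doubling = days_first * (2 * wells_drilled) ** (-b)
print(f"after {2 * wells_drilled} wells: ~{days_after_doubling:.1f} days per well")
```

The steeper later curves at Groundbirch correspond, in this picture, to a larger exponent b – the same improvement compressed into fewer wells.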

We also lower temperature-sensitive fibre-optic strands to monitor production. The fibre-optic readings can confirm the inflow of gas. As the gas enters the well, it expands; and as it expands, it cools. (This phenomenon will be familiar to some of you as the Joule-Thomson effect.) Cooler zones thus show where gas is flowing freely into the well. If not enough gas is flowing into the well, we can frack again.
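The cooling involved can be put in rough numbers with the Joule-Thomson relation ΔT = μ·ΔP. Both values below are assumptions for illustration; neither comes from the speech:

```python
# Joule-Thomson cooling of gas expanding into the wellbore.
mu_jt_k_per_bar = 0.4   # assumed J-T coefficient for methane at well conditions
pressure_drop_bar = 50  # assumed drawdown across the near-well region

delta_t_k = mu_jt_k_per_bar * pressure_drop_bar
print(f"inflowing gas cools by ~{delta_t_k:.0f} K")
```

A temperature drop of that order is easily resolved by fibre-optic distributed temperature sensing.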

The broader implications

So what does all this mean?
 
In financial terms, we reckon that an annual investment of about $4 billion between 2011 and 2015 – mostly for drilling and fracking wells – will result in Shell having a total of around 85 million cubic metres per day of production from our North American tight gas operations in five years’ time. That’s almost three times what we produced in 2009, and enough gas to provide power to around 13.6 million American homes.
 
At the current US wholesale gas price of about $4 per million BTU (which is roughly one half of the average wholesale UK price), that production yields an annual operational cash flow of more than $3.5 billion.
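That cash-flow figure can be roughly reconstructed. The energy-content conversion below (about 0.036 million BTU per cubic metre of gas) is a standard approximation, not a number from the speech:

```python
# Figures from the speech: ~85 million m^3/day of production at ~$4/MMBTU.
production_m3_day = 85e6
mmbtu_per_m3 = 0.0363     # approximate energy content of natural gas
price_per_mmbtu = 4.0     # US wholesale price quoted in the speech

gross_revenue_bn = (production_m3_day * mmbtu_per_m3
                    * price_per_mmbtu * 365 / 1e9)
print(f"~${gross_revenue_bn:.1f} billion/year gross revenue")
```

Gross revenue of roughly $4.5 billion a year is consistent with operational cash flow of more than $3.5 billion once operating costs are deducted.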
 
At a broader level, the IEA thinks that the world now has enough technically recoverable gas resources for 250 years at current production rates. And the race is now on to unlock unconventional gas resources in other parts of the world, including China.
 
Thus the global gas market is expanding rapidly on the back of these technological advances. That will accelerate the pace of global CO2 reductions as new gas-fired power reduces the need for more coal.

Gas-to-liquids technology

Figure 4: Pearl GTL – nearing completion

(Simon Henry) I’ll focus on another important innovation in the global gas market: the large-scale conversion of natural gas into liquid fuels, lubricants and chemical feedstocks. This will carry gas into new markets, and create cleaner products.
 
As I mentioned earlier, our flagship project in Qatar, Pearl GTL, is nearing completion. When finished, it will be the world’s first world-scale GTL plant – that is, on a scale equivalent to a large oil refinery. At one point, more than 50,000 workers from 60 nations were at work on a site the size of 350 football fields, one of the world’s largest industrial developments.
 
Pearl will produce enough GTL gasoil to fill over 160,000 cars a day and enough synthetic base oil each year to make lubricants for more than 225 million cars.
 
Last year, we secured approval for the use of a GTL kerosene blend in commercial aircraft, only the fourth time in 100 years of aviation history that a new fuel has been approved.
 
Most of the GTL gasoil will be blended into the global diesel pool. But in high concentrations – or used neat – it can also help to tackle the pollution that dogs many of the world’s cities. It burns with lower sulphur dioxide, nitrogen oxides and particulate emissions than conventional oil-based diesel. That’s been proved by trials of 100% GTL gasoil in several urban bus and taxi fleets.
 
GTL gasoil has also proved itself in altogether harsher conditions. It helped to power the winning Audi car in the Le Mans 24 hour race in France in 2006, 2007 and 2008 as part of a specially designed diesel fuel. In fact, 2006 was the first time a diesel powered car had won what is reputed to be the world’s toughest car race.
 
Let me give a brief overview of how we will create all these products at our Pearl GTL plant.
 
Put simply, our GTL process converts natural gas into synthesis gas, a mixture of hydrogen and carbon monoxide, which, with the help of catalysts, is then converted into liquid hydrocarbon products.

Cleaning up the gas

Pearl is linked to the world’s largest single gas field, the North Field, which stretches from Qatar’s coast out into the Gulf. The field contains some 25 trillion cubic metres of gas, about 15% of worldwide gas resources. The production process begins with the transporting of this gas to the shore, through pipelines.
 
In gas-liquid separation and clean-up units, we then remove all of the naturally occurring hydrocarbon liquids, such as condensates and liquefied petroleum gas, as well as ethane. We also remove contaminants like sulphur.
 
This leaves methane, which is, of course, the simplest hydrocarbon, made up of one carbon atom and four hydrogen atoms (CH4). Next begins the process of conversion, which takes place over three stages. Uniquely, Shell’s technology covers all three stages.
 
First, the pure natural gas is partially oxidised with pure oxygen to make the synthesis gas. This is done at temperatures of up to 1,300 degrees Celsius and under high pressure.
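That partial-oxidation step is, in stoichiometric terms, CH4 + ½O2 → CO + 2H2, which fixes the composition of the synthesis gas:

```python
# Partial oxidation of methane: CH4 + 1/2 O2 -> CO + 2 H2.
methane_mol = 100.0

o2_mol = 0.5 * methane_mol  # half a mole of oxygen per mole of methane
co_mol = 1.0 * methane_mol  # one mole of carbon monoxide
h2_mol = 2.0 * methane_mol  # two moles of hydrogen

ratio = h2_mol / co_mol
print(f"{methane_mol:.0f} mol CH4 -> {co_mol:.0f} mol CO + {h2_mol:.0f} mol H2 "
      f"(H2:CO = {ratio:.0f}:1)")
```

The resulting 2:1 hydrogen-to-carbon-monoxide ratio is close to what the Fischer-Tropsch stage consumes – one reason partial oxidation suits a GTL plant.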

The catalysts: converting the syngas to wax

The second stage is the heart of the GTL process. We pass the hydrogen and carbon monoxide mixture through the Fischer-Tropsch reactors, converting it into water and long-chain wax molecules – liquid in the reactor, though solid at room temperature.
 
I should explain that these reactors take their name from two chemists, Franz Fischer, a German, and his Czech-born partner, Hans Tropsch, who in the 1920s developed a chemical process to produce synthetic liquid fuels from coal.
 
They published their first findings in 1923, showing how they had produced a synthetic gas from coal, before using an alkali-iron catalyst to convert it to hydrocarbons.
 
Within our Fischer-Tropsch reactors at Pearl, the conversion is triggered by carefully designed cobalt-based catalysts.
 
Over three decades, we have developed catalysts that are able to support a large-scale and efficient commercial operation. What that means is that they are highly active, yet able to remain stable. And they also have high levels of what we call “selectivity” – that is the ability to produce the highest yields of the molecules representing the desired products.
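One standard way to quantify that selectivity (not mentioned in the speech, but common in the Fischer-Tropsch literature) is the Anderson-Schulz-Flory distribution: with chain-growth probability α, the weight fraction of chains of n carbons is W_n = n(1−α)²α^(n−1). A higher α shifts the product slate towards the long-chain waxes a GTL plant wants. The α values below are illustrative:

```python
# Anderson-Schulz-Flory product distribution for Fischer-Tropsch synthesis.
def wax_fraction(alpha, n_min=20, n_max=500):
    """Weight fraction of chains with at least n_min carbons (wax-like)."""
    return sum(n * (1 - alpha) ** 2 * alpha ** (n - 1)
               for n in range(n_min, n_max + 1))

for alpha in (0.80, 0.90, 0.95):
    print(f"alpha = {alpha:.2f}: ~{wax_fraction(alpha) * 100:.0f}% C20+ wax")
```

Small shifts in α move the wax yield dramatically, which is why catalyst composition and particle size matter so much.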
 
To develop these catalysts, we had to get their precise chemical composition right – although cobalt is the main element, they also include a number of other metals. And we had to ensure that these metals were correctly dispersed on the surface of the catalysts.
 
Critical to all this is the size of the cobalt particles. At the outset of our GTL work, in the 1970s and early 80s, the received wisdom was that the most active catalysts, and thus the most effective, were those with the smallest possible cobalt particles.
 
But by working in partnership with several universities, we discovered that this is not the case: in fact, it became clear that if the particles were too small their level of activity actually declined. That’s because the most active particles consist of certain configurations of cobalt atoms, which themselves can only be formed if there are enough atoms. Thus the main challenge in producing effective catalysts is ensuring that as many of the cobalt particles as possible are of the right size – which is at the nano-scale.
 
Another focus of our research has been on ensuring that the catalysts remain highly active, even after lengthy periods in use. This matters because, over time, their level of activity can slow as the cobalt becomes agglomerated or “sintered”, which increases the size of the particles.

We’ve developed several techniques to counter the deactivation of the catalysts, including conditioning the synthesis gas and regenerating the catalysts in-situ. 
 
Our catalysts are the result of three decades of close collaboration between our laboratories, our pilot plant in Amsterdam and the world’s first commercial GTL plant of its type, at Bintulu in Malaysia, which we opened in 1993.
 
To produce the thousands of tonnes of catalysts we need at Pearl GTL, we will have spent around four years using dedicated facilities in full-time production. Although, as I’ve said, the catalysts are structured at the nano-scale, their fine pores give them an enormous surface area. And the catalysts will be fed into 24 Fischer-Tropsch reactors, each of which weighs about 1,200 tonnes and contains tens of thousands of tubes for the catalysts.

Creating the final products

In the third stage of the GTL process, the long-chain liquid wax is refined into the final products.
 
In chemistry terms, what we are doing is changing the molecular structure of the syncrude. To do this, we use cracking to convert the long carbon-chain molecules into shorter-chain molecules. And we convert them into branched-chain molecules by a process of “isomerisation”.
 
After processing in the hydro-cracker, the resulting syncrude is fed into a distillation column or synthetic crude distiller, where it is separated into various fractions according to their boiling points.
 
Thus we are left with our products, ready for customers across the world.

Biofuels

I’ll now describe another way in which we are using technology to satisfy rising demand for cleaner transport fuels: biofuels.
 
Indeed at Shell, we believe that, of all the low-carbon transport fuels, biofuels can make the biggest contribution to tackling CO2 emissions in the next two decades. We expect their share of the road transport fuel mix to increase from somewhere approaching 3% today to around the 9% mark by 2030. 

Cosan

Shell is already one of the world’s largest distributors of transport biofuels. And we are now moving into production, following the recent agreement of a proposed $12 billion joint venture with Cosan, Brazil’s largest biofuels producer.
 
This will allow us to produce ethanol from Brazilian sugarcane, which can reduce fuel-related emissions by between 70% and 90% compared to standard petrol.
 
Yet there are concerns in some quarters about the sustainability of the biofuels industry, not least its impact on areas rich in biodiversity and the conditions of its workforce. So I’ll describe how Cosan uses its technological expertise not just to boost production, but also to improve the sustainability of its operations.
 
I should first mention that the main area of sugarcane production in south-central Brazil is some 2,000 kilometres from the fringes of the Amazon Rainforest. And sugarcane grown for ethanol in Brazil uses less than 1.5% of the country’s arable land. There are strict legal limits on the expansion of sugarcane crops into new areas.
 
Cosan is fast phasing out manual cane-cutting. In fact, more than 60% of its sugarcane is harvested by mechanical cutters.
 
These machines can harvest some 600 tonnes of sugarcane in a day, depending on the location – around 60 times the amount typically harvested by a single cane-cutter working manually. That reduces Cosan’s reliance on manual workers, while creating some new jobs in the operation and maintenance of this machinery.

Geographical information system

Cosan’s highly advanced geographical information system is also at the heart of its success. The system processes and interprets vast quantities of information about all aspects of Cosan’s 800,000 hectares of sugar cane – including the topography of the fields, data about soil nutrients and pest infestation levels.
 
If you think of 800,000 hectares as a circle with a radius of around 30 miles centred here on the College, it would stretch as far as Bury St Edmunds to the east and Bedford to the west, and to Peterborough in the north and Bishop’s Stortford in the south. That is a lot of sugarcane!
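As a quick sanity check on that image (my own arithmetic, not a figure from the speech), the area-to-radius conversion works out like this:

```python
import math

AREA_HECTARES = 800_000
area_km2 = AREA_HECTARES * 0.01            # 1 hectare = 0.01 km^2
radius_km = math.sqrt(area_km2 / math.pi)  # from area = pi * r^2
radius_miles = radius_km / 1.609344        # 1 mile = 1.609344 km

print(f"radius ≈ {radius_km:.1f} km ≈ {radius_miles:.1f} miles")
```

The result is a radius of roughly 50 km, or about 31 miles, which matches the "around 30 miles" claim and the distances from Cambridge to the towns named above.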
 
The geographical information system draws its data from all manner of sources, including government soil records, public weather stations, and, indeed, satellite imaging of the company’s crops. The company also produces detailed climate reports, based on records that go back 30 years.
 
Complex algorithms then convert all this data into digital maps that display detailed information about all 800,000 hectares. Cosan has been developing the system since 2003, drawing on advanced skills in mathematical modelling, the agricultural sciences, informatics and statistics.
 
And it brings the company several powerful advantages. Cosan can make extremely accurate predictions about its likely crop yields. And it is able to identify and rectify any problems with its crops quickly.
 
Moreover, the system also produces significant environmental benefits. Because it records data about soil nutrients and pest infestation levels, it can calculate the exact amount of fertilisers and herbicides needed at each site. As a result, Cosan can limit its use of these chemicals to what is strictly necessary – something that would otherwise be impossible across such vast and varied tracts of land.
 
To further limit its use of pesticides, Cosan uses natural pest control. For example, to counter the moth caterpillars that prey on its sugarcane, Cosan breeds small wasps, the caterpillars’ natural enemy. It then releases the wasps among its crops; the wasps kill the caterpillars by laying their eggs inside them.
 
All told, some 40 million wasps are released every month – but only when the company judges that they are really necessary. And because the wasps have a lifecycle of just 4-6 days, they are unable to spread far beyond the sugarcane fields. In fact, the geographical information system also guides decisions about when and where these wasps are used.

Co-generation

Figure 5: Costa Pinto Mill

Cosan’s sugarcane is also helping to tackle CO2 emissions beyond the transport sector. For many years, Cosan has used co-generation plants to produce electricity from the residue of its crushed sugar cane. This provides the company with a cheap and low-carbon source of power. In turn, this has boosted the company’s profitability, while significantly reducing the CO2 emitted in the course of its biofuels production.
 
The company is now also supplying this electricity to Brazil’s national grid, thanks to high-pressure boilers at its co-generation plants.

For example, at the Costa Pinto mill two high-pressure boilers together generate 66MW of electricity. One-third of this is used to power Cosan’s operations, and the remaining two-thirds are sold to the Brazilian national grid.
 
In late September the company opened a new sugar and ethanol mill in the city of Caarapó, with an annual co-generation capacity of 76MW. To put this in perspective, Cosan estimates that this would be enough to supply a city of half a million people.
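As a back-of-envelope check on those co-generation figures (my arithmetic, not Cosan's), the Costa Pinto split and the Caarapó per-capita estimate both look plausible:

```python
# Costa Pinto mill: 66 MW total, one-third used on site, two-thirds sold
costa_pinto_mw = 66
own_use_mw = costa_pinto_mw / 3
grid_mw = costa_pinto_mw - own_use_mw

# Caarapó mill: 76 MW for a city of half a million people
caarapo_mw = 76
population = 500_000
watts_per_person = caarapo_mw * 1e6 / population

print(grid_mw, watts_per_person)  # → 44.0 152.0
```

Around 150 watts of continuous supply per person is a reasonable order of magnitude for average per-capita electricity consumption in Brazil, so the "city of half a million" comparison holds up.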
 
Thus Cosan is also helping Brazil to develop a secure and sustainable power sector.

Geopolitics: supply security

By way of conclusion, I’ll finish with a word on the broader geopolitical context in which the industry takes its place – a key feature of the industry ever since Winston Churchill decided to convert the Royal Navy from coal to oil just over 100 years ago.
 
I studied Maths. My daughter is now studying Politics and International Relations – perhaps rather more relevant to my role today.
 
A good example is the expansion of the global gas market on the back of the technological advances described by Malcolm. This has many energy security implications.
 
For one thing, the opening up of so much unconventional gas in North America means that the continent will be self-sufficient in gas for decades to come – just a few years after fears that its production was set to decline.
 
The growth of unconventional gas is acting in concert with the expansion of the market for liquefied natural gas – gas that has been supercooled so that it can be transported by tanker in liquid form. LNG is growing fast on the back of new supplies and new customers – in fact at three times the rate of the overall gas market.
 
That strengthens supply security because LNG allows supplies to follow demand as it shifts and fluctuates around the world. After all, you can’t easily change the supply source for a pipeline but you certainly can for an LNG import terminal.
 
This is important for the UK and other EU countries facing declining production from their domestic gas sources. The world’s growing LNG network will allow them to buy their gas from a diverse range of sources.
 
By 2020, LNG is likely to account for nearly 30% of Europe’s gas needs. The number of regasification facilities in Europe is growing extremely fast. In practice this means competition between indigenous supplies, Russia, Africa and the Middle East. Building new facilities and pipelines while developing the market for such an important fuel are challenges in which Shell can and does play a significant role.

National oil companies

When thinking about the geopolitical context, the rise of the national oil companies – companies owned or controlled by governments – is now key.
 
In recent decades, the national oil companies have risen to prominence, as major resource holding countries have – understandably – sought to obtain full value for their natural resources.
 
This has transformed the energy industry. Our ability to win important business opportunities now hinges on our ability to develop productive relationships with these resource holders. 
 
At Shell, we have the utmost respect for the NOCs – their skills, their investment in research and development and their stewardship of their national energy resources in the best interests of their countries.
 
But we believe that we have much to offer them, especially our scientific, technical and management expertise. Malcolm’s account of unconventional gas production is a case in point.
 
This is happening in China, where the government has thrown its weight behind natural gas. It aims to more than double gas’s share of the primary energy mix to around 8-10% by 2020. China is seeking to import gas by pipeline from the Caspian region and from Russia, to import LNG, and to develop its own production.
 
New pipelines are being built. This will create a more flexible and integrated gas market.  Ultimately demand will be driven by security and cost of supply. And Shell is helping the country’s national oil companies to tap a significant unconventional gas resource base.
 
We already operate the Changbei tight gas field in Shaanxi Province under a production sharing agreement with the China National Petroleum Corporation. It supplies gas to Beijing and other cities in eastern China. With CNPC we are also appraising and developing unconventional gas resources elsewhere in the country.
 
And our partnership is now extending beyond China’s shores. Together, we have purchased Arrow Energy, an Australian gas company, in a $3.2 billion deal. The joint venture plans to convert coal bed methane – an abundant unconventional gas source – to LNG for export to China.
 
So it’s a great example of how strong relationships between Shell and national oil companies can drive advances in gas production.

Conclusion

That is an appropriate place to finish.
 
This evening, we’ve shown how Shell’s scientific and technical experts are already doing much to drive progress towards a secure and sustainable global energy system.
 
I hope we’ve also shown that Shell’s need for highly capable chemists, geologists, engineers, and mathematicians is only growing stronger.
 
Indeed, without the rigorous grounding in the hard sciences provided by Cambridge and the world’s other top universities, it will be impossible to meet the global energy challenge.
 
So we very much look forward to welcoming more Churchill graduates to our ranks in the future, further strengthening the relationship between Shell and our great college.

Thank you.