P05 Energy
Irrespective of the resource used to produce energy, energy solutions generally concern one of two national energy systems: reticulated energy (grid electricity and reticulated gas) for industrial and domestic use, and portable energy (gasoline, diesel, tanked natural gas). In each case, the energy issue is concerned with reducing GHG emissions, and/or achieving greater output per unit input (efficiency), and/or blunting demand through frugality or better buildings and cars. Both energy systems are supported by infrastructures which need modification to accept new energy sources or new energy regimes.
Alternatives
If CO2 could be removed from coal-fired emissions, the energy-GHG nexus would be broken in one step. Current ideas for this CO2 capture – carbon sequestration – involve capturing the 3% to 12% CO2 content of smokestack gases, compressing the gas to liquid for transport, and then sequestering it somewhere it will remain indefinitely. This seems like sweeping detritus under the rug because it is. Where to “hide” it is the issue – the main suggestions are deep cold oceans, where it will remain solid because of the high pressure, or exhausted aquifers, coal mines, or oil wells. Norway has for some years been pumping one million tonnes of liquefied CO2 each year into depleted natural gas domes under the North Sea. An advantage of this or other emission capture processes is that SO2 (sulphur dioxide, the cause of acid rain), mercury, and other undesirable emissions apart from GHG CO2 can also be removed. Sequestration, whatever the details and possible ecological hazards, will add cost to electricity generation. One way costs might be ameliorated is to sequester the CO2 into aging oil reserves to increase the pressure and hence the yield. An elegant advance on this method is to capture the CO2 before it gets to the smokestack. Following successful demonstrations in Algeria and Norway, BP and partners are building a 350MW power station in Scotland using this “decarbonised fuel”: natural gas is first “split” into hydrogen and CO2; the hydrogen provides clean fuel for power generation and the CO2 is piped directly for sequestration in North Sea oil reservoirs.
Sustainable (“Alternative”) Electricity
Since the commissioning of the first commercial nuclear power station at Calder Hall (Sellafield, UK) in August 1956, nuclear (fission) power has had mixed fortunes and a mixed press. The “nuclear debate” has been given new life because nuclear electricity generation offers savings in GHG emissions over coal-burning and other combustion power stations. Some argue that nuclear is a valid clean, green alternative – it emits nothing (with luck) and waste can be safely handled using Synroc storage technology (created in 1978 in Australia). Others argue it is the least green alternative conceivable because of radiation risks before, during, and after use, and vulnerability to terrorist attack. The unique property that some nuclear power technologies can produce weapons-grade materials puts nuclear in a separate category. Current dependence on nuclear varies significantly among countries – France (79%), Germany (28%), Japan (28%), UK (20%), US (20%). Some countries such as Sweden are actively downscaling their dependence on nuclear generation, but during 2006 several countries announced their intention to implement or expand nuclear power – Indonesia, Egypt, Belarus, Argentina, Nigeria, Iran. There is now an array of “fourth generation” nuclear fission reactor designs addressing safety and cost issues, including the Pebble Bed Modular Reactor (PBMR – a South African initiative), the Gas-Turbine Modular Helium Reactor (GT-MHR – a Russian initiative partly aimed at consuming decommissioned weapons plutonium), and the International Reactor Innovative and Secure project (IRIS – a multi-nation consortium formed by Westinghouse). Also, in September 2006, DoE granted $8M for research into engineering “pre-conceptual design”, a full rethink of future nuclear plant design. Nuclear has a continued attraction as it offers small, self-contained power generation units that can be brought online and offline relatively quickly, ideal for cycling to meet daily peaks. Although safe nuclear-generated electricity may be more expensive than that from coal-fired stations, this ability to support peak demand – obviating the need to build more coal-fired capacity – is still attractive.
Nuclear Fusion
The ITER project (International Thermonuclear Experimental Reactor), comprising the EU’s EURATOM, Japan, China, India, South Korea, Russia, and the USA, aims to demonstrate the scientific and technical feasibility of fusion power at the ITER device in Cadarache [France]. Fusion, if feasible in this application (in effect a controlled hydrogen bomb), would provide benign and abundant electricity on a transcontinental scale. Nuclear fusion must in no way be confused with the fission reactors that have been in use since 1956.
Greener Portable Fuels
Oils Ain’t Oils
“Oil” comes in many different forms, and this accounts for the frequent contradictions in forecasts of reserves. “Light sweet crude” – low-sulphur, readily-refined oil – has been the benchmark and most desired resource since the inception of the petroleum industry. This was the first class of oil to be exploited and it is now in shorter supply or has been exhausted in some places. The aggregate quality of crude oil is dropping. This coincides with the gradual tightening over decades of emission standards – for lead, sulphur, mercury – and both factors have put increasing obligations (and costs) on refiners. Average sulphur content (the “sweet-sour” parameter) has increased from 0.9% to 1.4% over the last 20 years. Imported crudes are becoming heavier and more corrosive. The anecdotal shortage of refining capacity in the US is due to this convergence of tighter output specifications (often varying by state and season) and declining feedstock quality. Adaptation often requires capital-intensive upgrade, replacement, or addition to refinery plant.
Synthetic Crude Oil (Syncrude)
Generous estimates of petroleum reserves still available generally include resources such as oil sands (bitumens), shale oil, and extra-heavy crude. Much of Canada’s reserves are in the form of oil sands; Venezuela’s estimated 1.36 trillion barrels of petroleum deposits are largely Orinoco extra-heavy crude, a high-sulphur oil. As crude oil prices rise, known technologies will be applied to produce syncrude (synthetic crude oil) and traditional petroleum end-products from these resources. If crude oil prices remain above around $30 per barrel, bitumens (oil sands) and extra-heavy crude will be economic to refine. The viability of shale oil is less certain. Shale oil is largely kerogen, a “young” form of crude oil. It can be burned directly as a solid fuel in place of coal or can be processed into syncrude at a rate of about 25 gallons (0.6 barrel) of syncrude per tonne of oil shale. There are an estimated 2.9 trillion barrels of syncrude in known shale oil deposits, about 750 billion barrels of it in the US – the equivalent of about 100 years of current demand. Where’s the catch? The processing costs of shale oil require a crude oil price of around $70 to $95 per barrel to be competitive. Also, a shale oil industry has apocalyptic environmental impacts: for each 1 million barrels per day of syncrude production, mining and remediation of 500 million tons of rock is needed each year, and 3 million barrels of water are required each day.
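The scale of those numbers is easy to check. A minimal back-of-envelope sketch (in Python, using only the yield and production figures quoted above) lands in the same ballpark as the 500 million tons of rock cited:

```python
# Back-of-envelope shale-oil arithmetic using the figures quoted above.
GALLONS_PER_BARREL = 42
YIELD_GAL_PER_TONNE = 25          # ~25 gallons of syncrude per tonne of shale

barrels_per_tonne = YIELD_GAL_PER_TONNE / GALLONS_PER_BARREL   # ~0.6 bbl

target_bpd = 1_000_000            # 1M barrels of syncrude per day
shale_tonnes_per_day = target_bpd / barrels_per_tonne
shale_tonnes_per_year = shale_tonnes_per_day * 365

# The report also cites ~3 barrels of process water per barrel of syncrude.
print(f"{barrels_per_tonne:.2f} bbl syncrude per tonne of shale")
print(f"~{shale_tonnes_per_year / 1e6:.0f}M tonnes of shale per year for 1M bbl/day")
```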
Non-crude Portable Fuels
Coal-to-Liquids (CTL) is a proven technology largely used in the US, but it is competitive only when crude oil is above about $40 per barrel and suitable-quality coal is modestly priced (about $1 to $2 per MBTU). The process is extremely dirty – there are challenges of waste disposal, water supply, and waste-water disposal or recycling. CTL activity is sited in coal regions in the US Midwest, and DoE forecasts the process will continue to be used, producing 1M to 2M barrels per day in 2030. Gas-to-Liquids (GTL) technology is more complex than oil refining; it converts natural gas into a range of petroleum fuels. The process is viable when the crude oil price is over about $25 per barrel and natural gas is in the range of $0.50 to $1.00 per MBTU.
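Taken together, the break-even figures in this section amount to a simple screening rule. The sketch below encodes them as a hypothetical viability check; the thresholds are only the rough prices quoted here, not a market model:

```python
# Rough viability screen for synthetic-fuel routes, using the break-even
# figures quoted in this section (illustrative thresholds only).
def viable_routes(crude_usd_bbl, coal_usd_mbtu, gas_usd_mbtu):
    routes = []
    if crude_usd_bbl > 30:                        # oil sands / extra-heavy crude
        routes.append("bitumen/extra-heavy upgrading")
    if crude_usd_bbl > 40 and coal_usd_mbtu <= 2.0:
        routes.append("coal-to-liquids (CTL)")
    if crude_usd_bbl > 25 and gas_usd_mbtu <= 1.0:
        routes.append("gas-to-liquids (GTL)")
    if crude_usd_bbl > 70:                        # lower bound of the $70-$95 range
        routes.append("shale-oil syncrude")
    return routes

print(viable_routes(crude_usd_bbl=60, coal_usd_mbtu=1.5, gas_usd_mbtu=0.8))
# -> ['bitumen/extra-heavy upgrading', 'coal-to-liquids (CTL)', 'gas-to-liquids (GTL)']
```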
Biomass-to-Liquids (BTL) originates from renewable sources, including wood waste, straw and agricultural waste, garbage, and sewage sludge. BTL fuels are several times more expensive to produce than gasoline or diesel, with wholesale costs of around $3.35 per gallon now (a crude oil equivalent price of $80-$90 per barrel), but this is expected to drop to around $2.40 per gallon by 2020. There is no commercial BTL in the US, but DOE commissioned some investigation from Bechtel in 1998. The world’s first commercial BTL plant, with a capacity of 4,000 barrels per day, is scheduled to come on line in Germany around 2008, with others to follow. BTL front-end technology is new and evolving and has parallels with cellulosic ethanol processes in its use of sophisticated enzymatic technologies. BTL in the short term is limited to use as a fuel extender rather than a primary fuel. In the long term, in the absence of a major energy breakthrough such as fusion power, BTL may become a mainstream source of portable fuels.
Renewable Portable Fuels (Biofuels)
Biofuels are seen as a bright hope on the energy horizon, but partisan positions often put an overly optimistic or overly pessimistic view. In all cases, projected costs should take account of the energy used in the fuel-making process, as well as any catalysts, other chemicals, and labor. Making fuels of the future from straw and similar materials has a labor-intensive component absent from the petroleum industry and, with agriculture as the main source, only solid planning will ensure a year-round supply of raw materials. Processes restricted to using “waste” are noble causes but will have a tough time ensuring continuity of materials supply in a local area, and those which use agricultural crops directly (sugar cane, corn) compete with food supply for land use and may produce unintended social consequences.
Biodiesel – “Biodiesel” is one of the most-used terms in current rethinking of energy sources. Biodiesel can be produced from a wide range of “sustainable sources” – vegetable oils and animal fats: rapeseed and sunflower in Europe, soy oil in the US, and soy or palm oil in Asia. There have also been studies of the use of coconut oil. The oil feedstock is put through a well-established catalytic process of esterification with an alcohol (methanol or ethanol) to produce methyl or ethyl esters, with glycerin and fatty acids as by-products. The by-products have some value, but less than the esterification cost of around $200 per tonne. Biodiesel has been in reliable use for over a century – mainly in stationary large-plant applications and during times of diesel shortage – but it is a stronger solvent than conventional diesel and can destroy fuel lines and other components not designed for it. The oils used all have a long-standing value as human or animal food. This and the cost of processing make biodiesel an expensive alternative to petroleum-based diesel oil. Although methyl esters have long been used as a component in soaps and detergents, biodiesel is only price-competitive with diesel oil when oil prices are high. But the “renewable” nature of biodiesel, rather than price, has been the reason for interest. Governments may legislate use of a blend containing biodiesel for import-replacement reasons, or green-motivated consumers may create increasing demand. Vehicle engines are now designed with the solvent properties of biodiesel in mind and to handle the quality variation and clogging that can damage older engines. Typically a 20% biodiesel blend is the maximum recommended, but some manufacturers now allow up to 100% biodiesel. The popular anecdote that any vegetable oil will work in a diesel engine – such as filtered oil from deep fryers – is true to the extent that a wide range of substances will “burn” in the compression-ignition diesel, but unprocessed oil will eventually damage the engine.
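The esterification chemistry puts hard numbers on inputs and by-products. As an illustrative sketch, assuming transesterification with methanol and an average vegetable-oil triglyceride of roughly 880 g/mol (the molar masses are typical values, not figures from this report):

```python
# Transesterification stoichiometry: 1 triglyceride + 3 methanol
#   -> 3 methyl esters (biodiesel) + 1 glycerol.
M_TRIGLYCERIDE = 880.0   # g/mol, assumed average for a vegetable oil
M_METHANOL = 32.0        # g/mol
M_GLYCEROL = 92.0        # g/mol

oil_kg = 1000.0                                   # one tonne of feedstock oil
mol_oil = oil_kg * 1000 / M_TRIGLYCERIDE          # ~1,140 mol
methanol_kg = 3 * mol_oil * M_METHANOL / 1000     # ~109 kg needed
glycerol_kg = mol_oil * M_GLYCEROL / 1000         # ~105 kg by-product
ester_kg = oil_kg + methanol_kg - glycerol_kg     # mass balance, ~1,005 kg

print(f"methanol needed: {methanol_kg:.0f} kg, glycerol by-product: {glycerol_kg:.0f} kg")
print(f"biodiesel (methyl ester) yield: {ester_kg:.0f} kg per tonne of oil")
```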
Gas Fuels
Fossil gas fuels yield lower emissions of GHGs (mainly CO2), hydrocarbons, nitrogen oxides, particulates, and sulphur. Liquefied Petroleum Gas (LPG; “autogas”, “bottled gas”) is best compatible with gasoline (spark ignition) engines, with about 75% of the energy density of gasoline. It is generally propane (C3H8) and/or butane (C4H10), plus other hydrocarbons (depending on source), and can be liquefied at normal temperatures. In contrast, Liquefied Natural Gas (LNG) is mainly methane (CH4), which is best suited as a diesel substitute but has an energy density of only about 60% of diesel. Whereas LPG can be liquefied at normal temperatures, LNG must be compressed (and stored) at around minus 160°C and 8 bar pressure, a costly requirement and one limiting its use to heavy applications. The liquefaction process removes almost all impurities, producing almost 100% methane. The ratio of hydrogen to carbon in a hydrocarbon is a measure of how much CO2 will be produced; the higher the H:C ratio the better – hence methane (CH4) is a cleaner fuel than propane (C3H8) when fully combusted.
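That H:C rule of thumb follows directly from the combustion equation. A minimal sketch comparing the gas fuels named above:

```python
# Complete combustion of a hydrocarbon CxHy:
#   CxHy + (x + y/4) O2 -> x CO2 + (y/2) H2O
# The higher the H:C ratio, the more of the combustion products are water
# rather than CO2.
fuels = {"methane (CH4)": (1, 4), "propane (C3H8)": (3, 8), "butane (C4H10)": (4, 10)}

for name, (c, h) in fuels.items():
    h_to_c = h / c
    co2, h2o = c, h / 2
    print(f"{name}: H:C = {h_to_c:.2f}, yields {co2} CO2 : {h2o:.0f} H2O per molecule")
# methane: H:C = 4.00 (1 CO2 : 2 H2O); propane: 2.67 (3 : 4); butane: 2.50 (4 : 5)
```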
Some energy initiatives, such as the use of landfill waste as an energy source, have double benefits. Methane (“marsh gas”) is emitted by all rotting organic waste. It is a major GHG, and landfills are the largest source of US methane emissions. Capturing this gas displaces the use of other fuels and prevents the methane joining the GHG load. The City of Memphis has operated a landfill gas project since September 2004, displacing the use of more than 67M gallons of gasoline in its first two years of operation, according to the EPA.
Natural gas hydrates (NGH) are a yet-untapped source of gaseous hydrocarbons. NGH is generally methane trapped in sponge-like structures of watery ice. The US Geological Survey estimates there are 320,000 trillion ft3 of NGH in deep water off the US coast and around 600 trillion ft3 in Alaska’s North Slope. This NGH potential dwarfs US natural gas annual production of around 20 trillion ft3, and reserves of around 200 trillion ft3. Commercial exploitation of NGH has not been attempted, and there is no pressure at present on the Alaska reserves while large natural gas deposits remain at Prudhoe Bay and elsewhere. Interest is likely to continue in the offshore deposits, but all recovery methods presently envisaged (heating and/or pumping water into the deposit) are energy-intensive or problematic.
There has been a move to run vehicles on LPG in many countries because it is cheaper – or has been made cheaper by government incentives – as a slightly greener fuel. In the pattern of world energy usage it must be noted that LPG plays a crucial part in the lives of billions of people in the developing world as the ultimate in portable energy, used for cooking, heating, refrigeration, and lighting. Coupled with modern small turbines as generators and water pumps, LPG is a strategic commodity on a global scale.
Greater Efficiency
Greener Buildings
With 20% of electricity used for heating and cooling and 16% for lighting, significant improvements in efficiency in just these two areas would have an impact of many GWh across the country. Work in solid-state lighting (SSL) already offers possible 50% savings in energy costs for lighting, and the impact of insulation and glazing on heating-cooling costs is already well known, but DOE’s Zero Energy Homes (ZEH) program is attempting a “whole house” approach to the systematic achievement of energy savings. Importantly, here as across the whole spectrum of energy issues, there is no “magic bullet”, no single technology that will alone save the day. Already, ZEH prototypes have shown that an energy-efficient building shell, efficient appliances, and the appropriate mix of solar water heating and photo-voltaics (PV) can produce a dwelling with near zero net energy purchases.
Lighting (domestic and commercial) consumes around 16% of US electricity and is a good target for improved efficiencies with widespread impact on the energy bottom line. Solid-state lighting (SSL) – using light-emitting diode (LED), organic LED (OLED), and polymer (PLED) technologies – offers promise of over 50% reduction in power consumption for equivalent light output. SSL promises efficacies of 150 to 200 lumens per watt, twice the efficacy of fluorescent lighting and 10 times the efficacy of incandescent lighting (the common Edison “globe”). Also, SSL units have typical lifetimes of 100,000 hours, the light is produced without heat, and units can be incorporated architecturally in ways not possible with conventional lighting.
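Those efficacy figures translate directly into wattage. A minimal worked sketch – the 800-lumen target is an assumption (roughly a traditional 60W bulb); the efficacies follow the ratios quoted above:

```python
# Power required to deliver a fixed light output under different efficacies:
# SSL at 150 lm/W, fluorescent at about half that, incandescent at a tenth.
TARGET_LUMENS = 800   # assumed target, roughly a traditional 60 W bulb

efficacy_lm_per_w = {"incandescent": 15, "fluorescent": 75, "SSL (LED)": 150}

for tech, eff in efficacy_lm_per_w.items():
    watts = TARGET_LUMENS / eff
    print(f"{tech}: {watts:.1f} W for {TARGET_LUMENS} lm")
# incandescent: 53.3 W; fluorescent: 10.7 W; SSL: 5.3 W
```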
The AC/DC Problem
Electricity is electron flow – direct current (DC) – and all early work in electricity, such as Edison’s work with electric illumination, was concerned with these simple flows. But direct current does not travel well over significant distances, and it was soon discovered that alternating current (AC), particularly at high voltages, travels without equivalent losses. So electricity is routinely distributed from power stations as high-voltage AC and converted using transformers down to domestic voltages, between 110V and 250V in various parts of the world. But scores of electronic devices in every office and home use low-voltage DC, which is why millions of little black transformers now litter the power outlets of the world. Significant energy could be saved if buildings had a reticulated DC circuit. A standard may arise which pipes DC (perhaps 18V or 24V) from a single efficient large transformer in each building.
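How much might that save? The sketch below is purely illustrative – the adapter count, loads, and conversion efficiencies are assumptions, not figures from this report – but it shows the shape of the argument for one building-wide transformer:

```python
# Illustrative comparison: many small plug-pack adapters versus one
# efficient central transformer. All figures below are assumptions.
N_ADAPTERS = 10        # small DC adapters in a typical home (assumed)
LOAD_W_EACH = 5.0      # average DC load per adapter (assumed)
EFF_SMALL = 0.60       # typical small adapter efficiency (assumed)
EFF_CENTRAL = 0.90     # one large building transformer (assumed)

dc_load = N_ADAPTERS * LOAD_W_EACH
draw_small = dc_load / EFF_SMALL       # ~83 W drawn from the mains
draw_central = dc_load / EFF_CENTRAL   # ~56 W drawn from the mains
saved_kwh_year = (draw_small - draw_central) * 24 * 365 / 1000

print(f"~{saved_kwh_year:.0f} kWh saved per year for this one household")
```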
Greener Transport
As vehicles account for 40% of US energy use, mostly drawn from imported crude oil, any improvement in vehicle efficiency has a direct effect on the US bottom line. Much has been done by mandate to clean up vehicle emissions in the last few decades, but little impact has been made on absolute fuel use. Vehicle innovation takes two forms – incremental improvement of present designs, and radical rethinking of power-plant (motor) and power-train (transmission) design. Over the last 20 years, motor vehicles have increased slightly in weight, have increased about 80% in power, and have improved fuel consumption by about 20% to a typical 29 miles per gallon. There will be continuing incremental innovation in lightweight materials, aerodynamics, friction reduction, and low rolling-resistance tires. These improvements are marginal, but they are independent of the type of power-plant. Some innovations in conventional power-plant and power-train design are relatively low-tech and have been satisfactorily demonstrated in buses and heavy vehicles. These include dynamic energy transfer to flywheels or to hydraulic pressure reservoirs – braking energy is transferred to the storage system and taken back again to assist with starting off, resulting in up to 50% improvement in fuel economy.
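The physics behind those flywheel and hydraulic systems is the kinetic energy otherwise lost as brake heat. A minimal sketch, where the vehicle mass, speed, and round-trip recovery fraction are assumptions for illustration:

```python
# Kinetic energy recoverable when a vehicle brakes to a stop:
#   KE = 1/2 * m * v^2
mass_kg = 12000.0             # a city bus (assumed)
speed_ms = 50 * 1000 / 3600   # 50 km/h converted to m/s

ke_kj = 0.5 * mass_kg * speed_ms**2 / 1000   # ~1,160 kJ per stop
recovery = 0.5                               # assumed round-trip storage fraction

print(f"~{ke_kj * recovery:.0f} kJ returned to the wheels per stop")
# In stop-start city duty this recaptured energy accumulates quickly,
# which is why the gains are demonstrated first in buses and heavy vehicles.
```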
The greatest single radical change in vehicle thinking is a range of electric-powered designs. The electric motor is a highly desirable power plant – it has a low parts count, is intrinsically efficient, compact, and low-maintenance, and gives the highest torque at greatest load (when starting). Compared to the properties of the electric motor, the internal combustion engine is one of the greatest mistakes in history. The efficiencies of electric rail and light rail (streetcars, trams) are legendary and have never been bettered. The first generation of electric automobiles had almost 100 years of innovation, from the 1830s into the 1920s, when Henry Ford’s mass production, and the availability of West Texas crude, killed them. Electric vehicles were still hand-made and were marketed to the well-to-do for town use; Ford’s vehicles were one-third the price and could travel the highways then appearing all over the US. Then as now, the energy efficiency of the electric motor was no match for the challenge of providing a portable high-endurance source of electricity. The challenge is two-fold – the cost, weight, and design-life of the electric source, and the vehicle range before recharge. Three answers have emerged – the all-electric vehicle, which with the cell technology available is ideally suited to urban travel; the hybrid gasoline-electric vehicle (first implemented in 1916), which charges its battery while under gasoline power; and the fuel-cell vehicle, which provides continuous electricity fuelled typically from a tank of hydrogen. The fuel-cell is not a battery of electric cells; it is a catalytic electrochemical device that produces electricity while fuel is provided. All else being equal, the hydrogen fuel-cell electric vehicle – often called the hydrogen car – has all the qualities of the vehicle of the future, and market penetration will certainly grow as fuel-cell costs drop and hydrogen filling stations spread. The first hydrogen filling station in the US was opened by BP in Michigan in October 2006; it manufactures hydrogen on the premises using mains energy. DoE forecasts that hybrids will grow from 0.5% of new vehicle sales now to around 9% by 2030, but sales of all-electric vehicles will continue to be almost non-existent at around 0.1% [sic] by 2030. Substantial decreases in the cost of electric vehicles may change those figures dramatically.
Futures
Infrastructure
The strategic plan of the US Climate Change Technology Program (CCTP) recognizes that the transition away from GHG-emitting (and uncertain) fossil fuels to renewables will require “continued improvements in cost and performance of renewable technologies”, which is obvious but, most significantly for real progress, the plan calls for “shifts in the energy infrastructure to allow a more diverse mix of technologies to be delivered efficiently to consumers in forms they can readily use.” This means two things – energy infrastructures must link to “a portfolio of renewable energy technologies” in situ, and – most radically – the grid must fully accommodate two-way energy flows to and from local areas and individual consumers. Energy corporations are no longer in the business of selling electricity (or gas); they must now sell connectedness to a dynamic and smart grid that is evolving every year (or every day) into a greener network of resources. Reticulation itself is the new industry.
It is the mix of a “portfolio” of new energy technologies that brings a new order of energy security. The distributed nature of DARPA’s internet is the key to its ruggedness, and distributed generation is the only possible answer for electricity grids of the future. Also, the distributed generation concept opens an entirely new possibility in world development. For instance, micro-turbines fed on biomass gases (or reciprocating engines fed on cow dung) could bring electricity to clusters of Indian villages now, rather than waiting for massive investment in huge central coal-fired power stations and high-voltage distributors.
In its simplest forms, local generation of energy is not new – elegantly simple solar hot-water technology has saved millions of GWh during its decades of history, and solar-powered remote telephone exchanges and satellite uplinks have also for decades brought communication to remote regions throughout the world.
Distributed Generation
One raft of clean, green solutions for distributed energy comes from unexpected sources – the jet engine, and an engine design dating from 1816. The laws of physics allow the turbine (“jet”) engine to scale very well from the very large to the very small, which is not possible with the internal combustion (gasoline or diesel) engine; hence, micro-turbines run from a portable or fixed gas supply can generate electricity with high efficiency anywhere, anytime. The Stirling external-combustion reciprocating engine is the Betamax of the engine world. The whims of investment rather than technological supremacy allowed the gasoline engine to kill off not only the electric vehicle in the 1920s but also proven, mature technologies such as the Stirling – highly efficient, with few moving parts, and open to a wide range of fuels.
Large-scale gas turbines in commercial power generation waste around two-thirds of their energy input as heat dissipated into the atmosphere. Cogeneration – which harnesses and delivers this heat as a valued service – is an approach that more than doubles energy efficiency and halves GHG emissions. Some large institutions now use cogeneration to provide off-grid electricity and heating (or refrigeration). Large commercial power plants are generally distant from populations; cogeneration applications based on small turbines can be installed in a matter of days in the center of population centers, where the electricity and heating/cooling services are used. Micro-turbines – with or without cogeneration – are quiet, have a higher power density (power-to-weight) than piston engines, extremely low emissions, and very few moving parts (sometimes just one). Some are designed to be air-cooled and can operate without lubricants or coolants. They can use a range of fuels – propane, diesel, kerosene, methane, or other biogases from landfills and sewage treatment plants. The transportable turbine generator can be brought to the source of the biogas and latched into the grid, rather than the biogas needing to be somehow moved to a power plant. This cavalier fashion in which generators can be attached (and detached) to the grid depends firstly on a regulatory framework that provides for it; there are no technical obstacles – modern power-switching “black box” technology is highly sophisticated and inexpensive.
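The “more than doubles” claim is simple energy accounting. A minimal sketch: the one-third electric efficiency matches the two-thirds-wasted figure above, while the fraction of waste heat usefully captured is an assumption:

```python
# Energy balance for a turbine with and without cogeneration.
fuel_in_mw = 30.0
electric_eff = 1 / 3          # two-thirds of the input is otherwise lost as heat
heat_captured_frac = 0.75     # share of waste heat usefully delivered (assumed)

power_mw = fuel_in_mw * electric_eff                              # 10 MW electricity
heat_mw = fuel_in_mw * (1 - electric_eff) * heat_captured_frac    # 15 MW useful heat

total_eff = (power_mw + heat_mw) / fuel_in_mw
print(f"electricity-only: {electric_eff:.0%}, with cogeneration: {total_eff:.0%}")
# 33% -> 83%: the same fuel burn now displaces separate heating fuel,
# which is how per-unit GHG emissions are roughly halved.
```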
Dumb Grids and Smart Networks
In just 13 minutes the power grid of the 80,000-square-mile Canada-US Eastern Interconnection area was toast. – Steve Silberman, “The Energy Web”
The electricity distribution network – heavy-duty copper cable terminating in every home and office in the developed world – offers a high-technology convergence that is yet to get the attention it warrants. Broadband over Power Lines (BPL) has been demonstrated for some years in places such as Australia, and in April 2006 a regulatory framework was approved in California. BPL exploits very basic physics. For many decades, consumers have been able to opt to have electric water heating turned off during peak demand in exchange for a lower tariff; the signal switching those heating circuits on and off is the crudest and earliest application of signaling over power lines. BPL services at speeds of 12Mbps have already been commissioned, and the theoretical maximum bandwidth for each consumer is many times higher. BPL will allow remote customers beyond the reach of ADSL telephone service or cable to have broadband services. In areas already serviced, BPL will provide a “third pipe” ensuring even keener competition for broadband services.
More importantly, BPL enables smart-grid technologies that have been the subject of very detailed theoretical development. A smart grid would allow domestic devices to talk back to the power grid; the consumer could set simple equipment, probably from their existing PC, to negotiate prices and qualities of electricity supply. Doubtless a deluge of BPL interfaces and software will appear once penetration reaches a critical point, and will give the term “smart building” a real meaning. A new activity, hobby, or obsession of “energy tuning” will emerge that will not only switch electrical devices but will decide which should be adjusted based on the prevailing tariff. If tariffs soar during peak load, BPL-based controllers may switch a device such as an air-conditioner off, or adjust its temperature setting. Conversely, if a grid supplier sees demand slipping, it could tout for extra demand from controllers that are allowed to switch suppliers. When critical mass is reached, the presently dumb electricity grid starts to become a smart web-like (or web-based) network. BPL offers obvious price advantages to consumers, but it also gives electricity suppliers the opportunity to smooth peaks and troughs in demand through real-time price “negotiation”. This would be the second-generation use of the “smart grid”. Obviously, the consumer does not get just those electrons their chosen supplier sends them, but in aggregate the system does work like that. If enough consumers specify that 60% of their requirement must come from green sources, the system will warn household controllers (or simply start switching things off) when the network aggregate demand reaches the level of supply. The market would become “perfect”.
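What might “energy tuning” look like in code? A minimal sketch of a household controller reacting to a real-time tariff; the device names, thresholds, and tariff feed are hypothetical, not part of any existing BPL standard:

```python
# A hypothetical "energy tuning" controller: shed or trim loads as the
# real-time tariff moves past each device's priority threshold.
def tune(devices, tariff_cents_kwh):
    """Return the actions a BPL-connected controller might take."""
    actions = []
    for name, (threshold_cents, adjustable) in devices.items():
        if tariff_cents_kwh > threshold_cents:
            # Trim adjustable loads; shed the rest outright.
            actions.append((name, "adjust setpoint" if adjustable else "switch off"))
        else:
            actions.append((name, "leave on"))
    return actions

devices = {
    "air-conditioner": (25, True),    # trim above 25 c/kWh (hypothetical)
    "water heater":    (15, False),   # off above 15 c/kWh (hypothetical)
    "refrigerator":    (999, False),  # never shed
}
for name, action in tune(devices, tariff_cents_kwh=30):
    print(f"{name}: {action}")
```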
Another issue concerns quality. For crude electric uses such as heating and lighting, the quality of the supply is secondary to continuity – dim lights are better than no lights – but there is an increasing range of manufacturing processes and home-office requirements that are greatly inconvenienced by as few as one or two blackouts or brownouts each year. A growing demand for power conditioning is certain. Whether this will be manifested in sophisticated (and expensive) home-office systems or local/neighborhood systems remains to be seen. What is certain is that vast electric grids stepping high-voltage AC down to districts and then down again within each local area simply cannot guarantee the level of quality that high-tech equipment needs. Lightning events, storms, or a road accident bringing down lines all threaten the Goliath hub-and-spokes model of electric grid. Traditionally, electricity suppliers have had a supply goal of 99.9% (“three-9s”), representing an outage of about nine hours in a year. In India, Iraq, and all of the developing world that goal is a distant dream; in the middle of Manhattan or Tokyo, nine hours over two or three incidents a year is no longer tolerable. More importantly, the supply is within specification for far less than 99.9% of the time. New goals in the electricity industry – and the high-tech equipment lobby – speak of “nine-9s” (99.9999999%) as the new reliability and quality standard; the downtime arithmetic is sketched after the list below. For practical purposes this is impossible to meet without a systematic decentralization of the grid. To achieve the next generation, electricity sub-stations and transformer points throughout the grid must be able to disentangle themselves from a grid crisis and continue to serve local areas with acceptable quality for a practical period of time. All of these possibilities need just one more “black box” in each building’s meter box, where the grid meets the consumer. This would be the intelligent junction for any or all of the following:
• tail-end from any local mains voltage generation;
• head end of building mains power circuits;
• head end for building DC circuit/s;
• head end for vehicle charging circuit;
• tail-end from the electricity grid (the “supply”).
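The gulf between “three-9s” and “nine-9s” is easier to feel as downtime per year. A one-loop sketch:

```python
# Downtime per year implied by an availability figure: each extra "nine"
# divides the permitted outage by ten.
SECONDS_PER_YEAR = 365 * 24 * 3600

for nines in (3, 6, 9):
    downtime_s = SECONDS_PER_YEAR * 10**-nines
    print(f"{nines}-nines: ~{downtime_s:,.2f} seconds of outage per year "
          f"({downtime_s / 3600:.2f} hours)")
# 3-nines -> ~8.76 hours/year (the traditional goal above);
# 9-nines -> ~0.03 seconds/year.
```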
But, above all, policy-makers must create a regulatory framework that permits the nation’s grids to join the digital age, and mandates standards and installation safety.
After some experimentation, the “energyplex” or “eco-industry park” concept will mature – suites of co-located industries will use “waste” energy (often heat), the “waste” output material of one operation will be an input for another, and process water will be recycled. This will be motivated not by clean, green sentiments but by cost savings.
Energy reticulation itself is the new industry.