Thursday, November 27, 2014
Could hemp nanosheets topple graphene for making the ideal supercapacitor?
David Mitlin, Ph.D., explains that supercapacitors are energy storage devices that have huge potential to transform the way future electronics are powered. Unlike today’s rechargeable batteries, which sip up energy over several hours, supercapacitors can charge and discharge within seconds. But they normally can’t store nearly as much energy as batteries, an important property known as energy density. One approach researchers are taking to boost supercapacitors’ energy density is to design better electrodes. Mitlin’s team has figured out how to make them from certain hemp fibers — and they can hold as much energy as the current top contender: graphene. “Our device’s electrochemical performance is on par with or better than graphene-based devices,” Mitlin says. “The key advantage is that our electrodes are made from biowaste using a simple process, and therefore, are much cheaper than graphene.”
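For a sense of why energy density is the sticking point, here is a rough back-of-envelope sketch in Python. The cell figures are illustrative assumptions, not numbers from Mitlin’s work: the energy stored in any capacitor is E = ½CV², so even a very large supercapacitor cell holds only a few watt-hours.

# Back-of-envelope comparison of supercapacitor vs battery energy density.
# All figures below are illustrative assumptions, not from the article.

def capacitor_energy_wh(capacitance_farads, voltage_volts):
    """Energy stored in a capacitor, E = 1/2 * C * V^2, converted to watt-hours."""
    return 0.5 * capacitance_farads * voltage_volts ** 2 / 3600.0

cell_energy = capacitor_energy_wh(3000, 2.7)   # a hypothetical 3000 F, 2.7 V cell: ~3 Wh
cell_mass_kg = 0.5                             # assumed cell mass
supercap_wh_per_kg = cell_energy / cell_mass_kg

lithium_ion_wh_per_kg = 200                    # typical order of magnitude for Li-ion cells

print(f"supercapacitor cell: {cell_energy:.1f} Wh, ~{supercap_wh_per_kg:.0f} Wh/kg")
print(f"lithium-ion cell:    ~{lithium_ion_wh_per_kg} Wh/kg")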
The race toward the ideal supercapacitor has largely focused on graphene — a strong, light material made of atom-thick layers of carbon, which when stacked, can be made into electrodes. Scientists are investigating how they can take advantage of graphene’s unique properties to build better solar cells, water filtration systems, touch-screen technology, as well as batteries and supercapacitors. The problem is it’s expensive.
Mitlin’s group decided to see if they could make graphene-like carbons from hemp bast fibers. The fibers come from the inner bark of the plant and often are discarded by Canada’s fast-growing industries that use hemp for clothing, construction materials and other products. The U.S. could soon become another supplier of bast. It now allows limited cultivation of hemp, which, unlike its close cousin marijuana, does not induce highs.
Is time up for Australia's uranium industry?
IN THE EARLY HOURS of December 7, a crack appeared in a large leach tank in the processing area of the Ranger uranium mine in Kakadu National Park. The area was evacuated, the tank completely failed, the containment system was inadequate and one million litres of highly acidic uranium slurry went sliding downhill — taking Energy Resources of Australia's credibility with it. The spill has left traditional owners who live and rely on creeks only kilometres downstream angry and "sick with worry", and raised profound concerns about the management culture and integrity of infrastructure at the mine.
Operations at Ranger are now halted. The mine operates inside Kakadu National Park — Australia's largest park and a dual World Heritage-listed region. It, and its people, deserve the highest standards of protection, but sadly Ranger is a long way short of this.
The Australian uranium industry has long been a source of trouble. Now it is increasingly in trouble. The commodity price has collapsed, projects across the country have been stalled, deferred or scrapped and the recent Kakadu spill has again raised community attention and concern.
At least the absence of a nuclear power industry in Australia means we don't have stories emerging like this one from the US - U.S. Dumped Tens of Thousands of Steel Drums Containing Atomic Waste Off Coastlines.
More than four decades after the U.S. halted a controversial ocean dumping program, the country is facing a mostly forgotten Cold War legacy in its waters: tens of thousands of steel drums of atomic waste. From 1946 to 1970, federal records show, 55-gallon drums and other containers of nuclear waste were pitched into the Atlantic and Pacific at dozens of sites off California, Massachusetts and a handful of other states. Much of the trash came from government-related work, ranging from mildly contaminated lab coats to waste from the country’s effort to build nuclear weapons.
Federal officials have long maintained that, despite some leakage from containers, there isn’t evidence of damage to the wider ocean environment or threats to public health through contamination of seafood. But a Wall Street Journal review of decades of federal and other records found unanswered questions about a dumping program once labeled “seriously substandard” by a senior Environmental Protection Agency official…
Monday, November 24, 2014
Worldwide Trends for Going Green
It takes time for any new product or action to develop and spread throughout the globe. "Going green" began many years ago but has only recently come to the very forefront of our minds and our surroundings.
The "plastic bag movement" is a prime example of this gradual change. reuseit.com has tracked the development since 2002 in Canada, the United States, Australia, Taiwan, India, Ireland, and further. Did you know Switzerland is a leader not only in the reusable bag movement but in recycled PET (PolyEthylene Terephtalate) as well? Over 82% of PET sold in Switzerland is recycled. Learn more about going green from reuseit.com below:
Thank you for taking the time to learn more about renewable energy! Knowledge is power. If there is something else you'd like to know, write to us at info@endeavorscorp.com and we'll do our best to address it for you!
Friday, October 31, 2014
A Second Life for the Electric Car Battery
As I wrote in a recent Times article on electric car batteries, scientists are expecting big breakthroughs in battery technology over the next five years that will increase the range of electric cars while reducing their cost. But even with these advances, researchers acknowledge that any rechargeable battery will gradually lose its capacity to store energy after repeated cycles of charging and discharging.
Once storage capacity falls below a certain level, the battery can no longer provide the range that electric car owners will expect, according to Micky Bly, the executive director of global battery, electric vehicle and hybrid engineering at General Motors. For its new Chevy Volt, GM expects that level to be around 60 to 65 percent of the battery’s original capacity, he said in a telephone interview.
At the same time, with most of a battery’s useful life still intact, automakers anticipate that it could serve other, less demanding purposes than powering a few thousand pounds of car.
A number of projects and new ventures are already under way to explore second-life applications for lithium-ion batteries. G.M. has announced a cooperative agreement with ABB, an energy technology company. And Nissan has formed a joint venture called 4R Energy with the Sumitomo Corporation.
This month, researchers at the National Renewable Energy Laboratory, financed by the Department of Energy, announced their own initiative in this area, a collaboration with academic and industry partners.
From a technical perspective, a special area of focus for the laboratory’s research will be repurposing these batteries for Community Energy Storage systems on the electric utility grid, according to Jeremy Neubauer, a senior engineer in the lab’s energy storage group. If all goes as planned, in the smart grid of the future electric utilities would distribute thousands of these Community Energy Storage packs throughout the grid to help them manage power flow, especially during peak times or outages.
One pack would store 25 to 50 kilowatt hours of electricity, which could provide power for a few hours to four or five homes. Packs of this size would require stringing together two or three electric car batteries, and the compact size of these batteries lends itself to this purpose, Mr. Neubauer said. He also expects that using second-life batteries would be cheaper for the utilities than buying new ones.
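As a rough illustration of that sizing logic, here is a short Python sketch. The original pack capacity, the 65% retirement threshold and the household load are assumptions chosen for the example (the threshold echoes GM's figure above), not NREL's published numbers.

import math

ORIGINAL_PACK_KWH = 24.0        # assumed capacity of one EV pack when new
RETIREMENT_FRACTION = 0.65      # roughly GM's 60-65% end-of-automotive-life level
usable_kwh = ORIGINAL_PACK_KWH * RETIREMENT_FRACTION   # ~15.6 kWh left per retired pack

HOMES = 4.5                     # "four or five homes"
AVG_LOAD_KW = 1.5               # assumed average draw per home during an outage

for target_kwh in (25, 50):
    packs = math.ceil(target_kwh / usable_kwh)
    hours = target_kwh / (HOMES * AVG_LOAD_KW)
    print(f"{target_kwh} kWh pack: ~{packs} retired EV batteries, ~{hours:.1f} h of backup")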
But beyond the technical feasibility, what’s new about the lab’s research will be the focus on testing new financial and ownership models for the car batteries. Ahmad Pesaran, principal engineer on the lab’s study, said, “We want to prove the battery has value beyond its use in the car, and by creating business models, to realize this added value, ultimately lowering the cost of owning the car for the consumer.”
Wednesday, October 29, 2014
Where to invest: Market for offshore drilling units booms


Tuesday, October 28, 2014
Morgan Stanley-Backed Atlantis Targets India, China for Tidal Power Plants
Atlantis Resources Corp., an ocean-current turbine maker backed by Morgan Stanley, plans to expand in China, India and South Korea after winning a bid in the U.K. to build the world’s largest tidal-power project.
Atlantis Resources may start building a 50 megawatt tidal farm by 2012 in Gujarat, a western Indian state, and conduct commercial-scale trials in South Korea, Timothy Cornelius, the chief executive officer, said in an interview today.
“China’s the next big market for tidal energy,” Cornelius, 34, said in Singapore at the Clean Energy conference. “It has the most natural tidal resources in the world and can be home to more than 1,000 megawatts of tidal energy.”
Global production of electricity harnessing the ocean waves may climb ten-fold to as much as 300 megawatts in the next couple of years, said Cornelius, a former submersible engineer who splits his time between Singapore and London. The potential to produce marine power economically is about 24,000 megawatts, he said. It costs 2.5 million pounds ($4.01 million) per megawatt for a minimum 200-megawatt tidal project, he said.
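Taken at face value, Cornelius's cost figure implies a capital outlay of roughly half a billion pounds for the minimum-scale project he describes (a quick sketch, using the exchange rate implied by the article):

cost_per_mw_gbp = 2.5e6              # 2.5 million pounds per megawatt
min_project_mw = 200                 # minimum project size quoted
gbp_to_usd = 4.01 / 2.5              # exchange rate implied by the $4.01m figure (~1.60)

total_gbp = cost_per_mw_gbp * min_project_mw
total_usd = total_gbp * gbp_to_usd
print(f"200 MW project: ~£{total_gbp / 1e6:.0f}m (~${total_usd / 1e6:.0f}m)")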
Saturday, October 18, 2014
Evidence for dark energy accumulates
However, it has been just over 10 years (since late 1997) that there has been strong evidence for the existence of dark energy. This evidence came from the observation of Type 1a supernovae. Such supernovae are expected on theoretical grounds to have roughly the same absolute brightness in all cases. This is because they result from the accumulation of hydrogen on the surface of white dwarf stars. This hydrogen is "stolen" by the white dwarf from a larger companion star, and as soon as a sufficient amount accumulates, a thermonuclear explosion occurs, destroying the white dwarf and producing a supernova.
Because all Type 1a supernovae should have approximately the same absolute brightness, it is possible to compare their observed brightness with what would be expected as a result of the absolute brightness and their estimated distance. The distance of a Type 1a supernova can be estimated from the redshift of its spectral lines, and assumptions about how fast the universe is expanding.
Up until 1997 it had generally been assumed that the universe was expanding, but at a slowly decreasing rate. However, what was determined in 1997 was that distant Type 1a supernovae had an observed brightness that was dimmer than would be expected on the assumption that the expansion of the universe was decelerating. Instead, the most natural assumption was that the expansion was accelerating, which would mean that the distant supernovae were farther away than expected, and hence dimmer.
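The logic rests on the inverse-square law for a standard candle: observed flux falls off as 1/d², so a supernova that appears dimmer than expected must sit farther away. A minimal sketch, using an illustrative dimming figure rather than a number from the actual surveys:

import math

def distance_ratio(dimming_fraction):
    """If a standard candle appears dimmer by this fraction than predicted,
    flux_obs / flux_pred = (d_pred / d_actual)^2, so the source is
    1 / sqrt(1 - dimming_fraction) times farther away than predicted."""
    return 1.0 / math.sqrt(1.0 - dimming_fraction)

# e.g. a supernova 25% dimmer than the decelerating-universe prediction
print(f"{distance_ratio(0.25):.2f}x the predicted distance")   # ~1.15x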
There was a lot of uncertainty in the initial measurements of supernova brightness, as well as questions about the suitability of assumptions made in order to calculate the expected brightness. However, there were two other lines of evidence that supported the idea of a cosmological constant (and hence, dark energy).
One line of evidence was obtained from observations of the angular size of hot and cold spots in the cosmic microwave background (CMB) radiation. The actual size of these fluctuations can be calculated theoretically based on certain reasonable assumptions. However, the size that we observe depends on the curvature of the universe. For instance, if the curvature is positive, like a convex lens, then the angular size of the fluctuations will be magnified and appear larger than calculations predict. But it turns out that the observed size is very close to what is predicted, meaning that the universe must be nearly flat. And from other considerations, the universe can be "flat" only if there is a much higher energy density than can be accounted for in terms of all suspected types of matter, even dark matter. This extra energy density is best accounted for in terms of dark energy.
A third line of evidence comes from the observed distribution of galaxies and galaxy clusters. By causing the expansion of the universe to accelerate, dark energy also causes galaxies and clusters of galaxies to be spread farther apart than we would otherwise expect – and this additional spread is exactly what is observed.
However, the idea of dark energy, especially if it is based on a cosmological constant, is fairly radical, because we have no theoretical way to explain what dark energy is or why it should exist. Therefore, the more evidence we have that it does in fact exist, the better.
So it's quite welcome that a fourth line of evidence for the existence of dark energy is now much more strongly supported by data in a new study. The new evidence is based on more precise measurements of what is called the integrated Sachs-Wolfe effect. This effect is also found in observations of the CMB, but observations of a very different kind.
The effect is predicted to be manifested as microwave photons of the CMB pass through regions of the universe with densities that are higher or lower than the overall average. Consider a region of higher density, such as a supercluster of galaxies. As the photon enters the region, its energy will increase, because it is exchanging gravitational potential energy for electromagnetic energy, like a rock gains kinetic energy falling in Earth's gravitational field. The photon's energy gain is manifested in a shorter wavelength.
Galaxy superclusters are very large, from 100 to 500 million light-years in diameter. So in the time it takes a photon to cross a supercluster, the expansion of the universe will reduce the average matter density of the supercluster. The net effect is that the photon will lose less energy as it is leaving the supercluster than it gained when it entered. So the photon has a net energy gain in the process.
The universe also contains "supervoids", which are regions of size similar to superclusters where there are few galaxies, and the average matter density is less than the overall average. While a photon is passing through a supervoid, it will experience a net energy loss. On top of these energy gains and losses, a photon also gradually loses energy due to the expansion of the universe (as the photon wavelength gradually increases). There are still gains and losses after making allowance for this expansion effect. Moreover, the energy gains or losses are magnified if the expansion is accelerating.
The integrated Sachs-Wolfe effect is essentially these magnified energy gains and losses. The existence of this effect is a testable prediction of the existence of dark energy. Another way to think of the effect is as a measure of the extent that a supercluster or supervoid is expanding under the influence of dark energy, whereas there should be no expansion in the absence of dark energy. Importantly, this effect is independent of the brightness-distance relationship for Type 1a supernovae.
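A toy numerical sketch of that picture (arbitrary units, not a real cosmological calculation): the net fractional temperature shift is roughly proportional to the integral of the potential's rate of change along the photon's path, so a potential that decays while the photon crosses leaves a warm spot behind a supercluster and a cold spot behind a supervoid.

import numpy as np

t = np.linspace(0.0, 1.0, 1000)     # crossing time, arbitrary units
decay = 1.0 - 0.1 * t               # potential becomes ~10% shallower during the crossing

phi_cluster = -1.0 * decay          # supercluster: negative (attractive) potential well
phi_void = +1.0 * decay             # supervoid: potential "hill" of the opposite sign

# dT/T ~ 2 * integral of dPhi/dt along the path (schematic form, arbitrary units)
dT_cluster = 2.0 * np.trapz(np.gradient(phi_cluster, t), t)
dT_void = 2.0 * np.trapz(np.gradient(phi_void, t), t)

print(f"supercluster: dT/T ~ {dT_cluster:+.2f}  (net energy gain -> warm spot)")
print(f"supervoid:    dT/T ~ {dT_void:+.2f}  (net energy loss -> cold spot)")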
The new evidence for dark energy, then, is that very careful measurements of the energy of CMB photons in the directions of known superclusters and supervoids detect the existence of the integrated Sachs-Wolfe effect with very high probability, and hence another prediction based on the existence of dark energy is verified.
In the present study, about 3000 superclusters and 500 supervoids were initially selected from the Sloan Digital Sky Survey. This is out of around 10 million superclusters estimated to exist in the visible universe. Out of this sample, 50 superclusters and 50 supervoids having the largest density variation from the average were selected for closer examination.
The maximum distance of a chosen cluster was a redshift of about 0.5, corresponding to a distance of about 5 billion light-years. Because of the huge size of a supercluster, a typical supercluster would have an angular diameter, as seen from Earth, of about 1/25 of a full circle, or 14 degrees. The researchers decided to consider circles of angular radius 4 degrees around the center of a cluster as containing the bulk of the cluster. Such circles are still about 16 times the diameter of the full Moon (1/2 angular degree).
Within each circle, the average temperature of CMB photons was measured, and compared to the overall average. The variations were very small – about 10⁻⁵ K, compared to the average CMB photon temperature of 2.73 K – about 3 parts in a million. Nevertheless, the measurements were accurate enough that the probability of this variation being measured by chance is only about 1 in 200,000.
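A quick check of the figures quoted above (in Python, just the arithmetic):

delta_T = 1e-5              # K, typical measured deviation within a circle
T_cmb = 2.73                # K, mean CMB temperature
print(f"fractional deviation: {delta_T / T_cmb:.1e}")      # ~3.7e-6, a few parts per million

circle_diameter_deg = 2 * 4.0       # circles of 4-degree angular radius
moon_diameter_deg = 0.5             # angular diameter of the full Moon
print(f"circle diameter is ~{circle_diameter_deg / moon_diameter_deg:.0f}x the full Moon's")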
This is not the first research effort that has produced evidence for the integrated Sachs-Wolfe effect. However, it is based on cleaner data, and has the lowest probability of falsely showing an effect based only on chance.
News articles:
- Most Direct Evidence of Dark Energy Detected (8/11/08)
- The most direct signal of dark energy? (8/8/08)
- Supervoids and clusters reveal dark energy (8/7/08)
- Dark Energy's Early Fingerprints (8/6/08)
- Dark Energy's Fingerprint Found in Distant Galaxies (8/5/08)
- Dark Energy Signs Seen in Giant Clusters and Voids (8/4/08)
- UH team sees ‘dark energy’ trail (8/4/08)
- Caught in the Act: Dark Energy Expanding the Universe (8/4/08)
- Unmasking Dark Energy (8/1/08)
- Scientists Find Direct Evidence of “Dark Energy” in Supervoids and Superclusters (7/31/08)
- Dark energy 'imaged' in best detail yet (5/23/08)
Further reading:
Supervoids and Superclusters – Web pages produced by the research team, with illustrations and background information
An Imprint of Super-Structures on the Microwave Background due to the Integrated Sachs-Wolfe Effect – short technical paper describing the research
Dark Energy Detected with Supervoids and Superclusters – longer, more leisurely presentation of the research, by the research team
Tags: dark energy, Sachs-Wolfe effect
Wednesday, October 15, 2014
Renewables vs nuclear energy: What is better for climate change?
Renewable energy certainly seems like a better solution than building more nuclear power plants, and not just because of the recent Fukushima accident. Accidents such as Fukushima and Chernobyl are rare, but when they occur they are usually accompanied by massive environmental damage that is long-lasting and difficult to clean up.
Nuclear power plants are extremely expensive to build because they need to comply with a number of different safety measures and because they are technologically complex. Even choosing a site for a nuclear power station is very difficult, because communities usually oppose having a plant nearby. Renewable energy technologies have been constantly dropping in price, and it's only a matter of time before wind and solar become cost-competitive with fossil fuels; in fact, if you count the total damage in environmental, social and health costs due to climate change and pollution, then renewable energy is already better in terms of costs than fossil fuels.
The technologies used for nuclear power generation could also be used to develop nuclear weaponry, and we must not discount the possibility of a terrorist attack; just imagine what could happen if a radical terrorist organization were to take over a nuclear power plant.
The clean energy race is well underway, and countries around the world have been seriously considering their renewable energy options in order to choose the ones best suited to them. In many countries, future nuclear power development has been pretty much abandoned, and the golden age of nuclear power generation seems to be well behind us.
Since 2000, global renewable energy capacity has more than doubled. In 2012, renewable energy accounted for 56% of new electricity generation in the United States.
It would be wrong to say that we should abandon nuclear energy straight away, because nuclear energy accounts for a significant share of electricity generation in many countries. The solution is to focus primarily on renewable energy sources such as solar and wind when discussing our energy future. Nuclear power has had a pretty good run, and once current nuclear power plants reach the end of their lifetimes we should consider replacing them with renewable energy solutions.
Friday, September 26, 2014
LEDs will slash energy use for lighting by 95%
A simple (but not perfect) measure for lighting efficiency is the number of lumens (a measure of light intensity) a lighting source produces per watt. A conventional incandescent bulb gets 13 lumens per watt to light your room, while a replacement LED bulb from Philips that can be bought at Coles or Woolworths achieves 80 lumens per watt (a compact fluorescent globe gets about 60 lumens per watt). CREE (the industry leader who, it is speculated, may purchase the next best, Philips’ Lumileds division) has successfully demonstrated light-emitting diodes running at 300 lumens per watt in the lab. CREE currently sells a $10, 9.5W bulb (available in the US), which produces 85 lumens per watt and can directly replace an old-style 60W globe.
Other breakthroughs and innovations are contributing to higher efficiencies in LED lighting, including a breakthrough by German researchers which will affect not only LED lights but also laptop and mobile phone chargers, cutting losses in today’s most efficient power supplies by half, from 10% to just 5%.
Taking all this into consideration, according to the US Department of Energy SSL (Solid State Lighting) program http://energy.gov/eere/ssl/solid-state-lighting, we should be able to achieve wall-plug efficiencies of 250 lumens per watt by 2020, which means that a conventional bulb replacement in 2020 would be available using only a third of the electricity of today’s LED bulbs.
At that staggering rate of 250 lumens per watt, it will only take 3W to light a room, when it used to be done with 60 Watts of power. This represents a 95% reduction in energy required for lighting.
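Putting the post's own efficacy figures side by side makes that arithmetic explicit; a 60 W incandescent at roughly 13 lm/W puts out on the order of 780 lumens, and the sketch below asks how many watts each technology needs to match it.

target_lumens = 60 * 13     # light output of an old 60 W incandescent, ~780 lm

efficacies_lm_per_w = {
    "incandescent": 13,
    "compact fluorescent": 60,
    "current LED bulb": 80,
    "projected 2020 LED (wall-plug)": 250,
}

for name, lm_per_w in efficacies_lm_per_w.items():
    watts = target_lumens / lm_per_w
    saving = 1 - watts / 60
    print(f"{name:30s} {watts:5.1f} W  ({saving:.0%} less than the 60 W original)")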
This will have a profound effect on the world’s requirement for lighting energy. We can expect, on an absolute basis, that the 19% of the world’s electricity currently used for lighting will drop by at least 75%. On today’s numbers, the reduction is equivalent to the entire electricity consumption of the European Union.
In developed nations, these huge efficiency gains from LEDs in the lighting sector will contribute to the continuing restructuring of the electricity supply industry, which currently faces a death spiral unless it can electrify the remaining residential energy services now supplied by fossil gas and support a fast-tracked electrification of the world’s vehicle fleet.
In developing countries, rooms will be able to be lit with 3W, and task lights will need even less. This means that almost all of the remaining 1.5 billion people in the world without an electricity supply will be able to access one at very minimal marginal cost in the next 5 years.
Monday, September 15, 2014
The Battle for Power on the Internet
We’re in the middle of an epic battle for power in cyberspace. On one side are the traditional, organized, institutional powers such as governments and large multinational corporations. On the other are the distributed and nimble: grassroots movements, dissident groups, hackers, and criminals. Initially, the Internet empowered the second side. It gave them a place to coordinate and communicate efficiently, and made them seem unbeatable. But now, the more traditional institutional powers are winning, and winning big. How these two sides fare in the long term, and the fate of the rest of us who don’t fall into either group, is an open question—and one vitally important to the future of the Internet. In the Internet’s early days, there was a lot of talk about its “natural laws”—how it would upend traditional power blocks, empower the masses, and spread freedom throughout the world. The international nature of the Internet circumvented national laws. Anonymity was easy. Censorship was impossible. Police were clueless about cybercrime. And bigger changes seemed inevitable. Digital cash would undermine national sovereignty. Citizen journalism would topple traditional media, corporate PR, and political parties. Easy digital copying would destroy the traditional movie and music industries. Web marketing would allow even the smallest companies to compete against corporate giants. It really would be a new world order.
This was a utopian vision, but some of it did come to pass. Internet marketing has transformed commerce. The entertainment industries have been transformed by things like MySpace and YouTube, and are now more open to outsiders. Mass media has changed dramatically, and some of the most influential people in the media have come from the blogging world. There are new ways to organize politically and run elections. Crowdfunding has made tens of thousands of projects possible to finance, and crowdsourcing made more types of projects possible. Facebook and Twitter really did help topple governments.
But that is just one side of the Internet’s disruptive character. The Internet has emboldened traditional power as well.
On the corporate side, power is consolidating, a result of two current trends in computing. First, the rise of cloud computing means that we no longer have control of our data. Our e-mail, photos, calendars, address books, messages, and documents are on servers belonging to Google, Apple, Microsoft, Facebook, and so on. And second, we are increasingly accessing our data using devices that we have much less control over: iPhones, iPads, Android phones, Kindles, ChromeBooks, and so on. Unlike traditional operating systems, those devices are controlled much more tightly by the vendors, who limit what software can run, what they can do, how they’re updated, and so on. Even Windows 8 and Apple’s Mountain Lion operating system are heading in the direction of more vendor control.
I have previously characterized this model of computing as “feudal.” Users pledge their allegiance to more powerful companies who, in turn, promise to protect them from both sysadmin duties and security threats. It’s a metaphor that’s rich in history and in fiction, and a model that’s increasingly permeating computing today.
The Washington Post reports on the latest Snowden NSA revelations, this time showing that backdoors into big internet companies like Google and Yahoo were not considered sufficient, so the NSA has worked out the weak points in the companies' internal networks as well - SSL Added and Removed Here! :).
The National Security Agency has secretly broken into the main communications links that connect Yahoo and Google data centers around the world, according to documents obtained from former NSA contractor Edward Snowden and interviews with knowledgeable officials. By tapping those links, the agency has positioned itself to collect at will from hundreds of millions of user accounts, many of them belonging to Americans. The NSA does not keep everything it collects, but it keeps a lot.
According to a top-secret accounting dated Jan. 9, 2013, the NSA’s acquisitions directorate sends millions of records every day from Yahoo and Google internal networks to data warehouses at the agency’s headquarters at Fort Meade, Md. In the preceding 30 days, the report said, field collectors had processed and sent back 181,280,466 new records — including “metadata,” which would indicate who sent or received e-mails and when, as well as content such as text, audio and video.
The NSA’s principal tool to exploit the data links is a project called MUSCULAR, operated jointly with the agency’s British counterpart, the Government Communications Headquarters (GCHQ). From undisclosed interception points, the NSA and the GCHQ are copying entire data flows across fiber-optic cables that carry information between the data centers of the Silicon Valley giants.
The infiltration is especially striking because the NSA, under a separate program known as PRISM, has front-door access to Google and Yahoo user accounts through a court-approved process.
The MUSCULAR project appears to be an unusually aggressive use of NSA tradecraft against flagship American companies. The agency is built for high-tech spying, with a wide range of digital tools, but it has not been known to use them routinely against U.S. companies.
Friday, September 12, 2014
How to develop a business plan for oil depletion
The world currently finds itself in the position of a man standing in a road who has just noticed two large trucks bearing down on him. These metaphorical trucks are labelled Peak Oil and Global Warming. However, despite increasing evidence and clearer definitions of the risks, collectively we have been remarkably reluctant to move out of the path of the oncoming trucks.
This article will only look at Peak Oil, arguably the most imminent threat to our collective welfare. The general reluctance to act and invest appears to stem from the fact that Peak Oil seems an improbable event, given that oil production and its use has expanded steadily for the last 150 years; and that to do anything about it will be expensive and disruptive to our way of life. A dangerously complacent view that is, unfortunately, widely held.
Peak Oil is often described rather narrowly as running out of oil. This is both misleading and inaccurate. Oil is not running out, but the ability to provide all the oil that we might want at a reasonable price is disappearing. In many countries physical exhaustion of the reserves is already happening. The North Sea is a good example. Oil production in the UK sector of the North Sea will average around 1.3 million barrels/day (b/d) in 2011 or just 45 per cent of its 1999 peak of 2.9 million b/d.
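That fall from 2.9 million b/d in 1999 to about 1.3 million b/d in 2011 corresponds to a compound decline of roughly 6-7% a year:

peak_1999 = 2.9      # million barrels/day, 1999 peak
level_2011 = 1.3     # million barrels/day, 2011 average
years = 2011 - 1999

annual_decline = 1 - (level_2011 / peak_1999) ** (1 / years)
print(f"average decline: ~{annual_decline:.1%} per year over {years} years")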
Around 25 large-scale oil producers and up to 40 small ones are already in sustained decline. With roughly half of global production coming from countries where production capacity is falling (depletion), the remaining producers need to work ever harder to increase output to offset the losses from those in decline and to meet increasing demand. This is a situation that deteriorates with every additional country that goes into production decline.
Physical exhaustion is only one way that the world can be deprived of the oil production flows it would like. Other threats are:
* Physical constraints – if rebels blow up pipelines or there are wars or revolutions, eg Libya, Nigeria, etc.
* Financial constraints – where the money is either not available or the host government doesn't allow companies with the money to invest, eg Venezuela, Mexico, etc.
* Political constraints – where a political decision not to produce or not to export is made. All Opec quotas are political constraints, while recent statements by both Russia and Saudi Arabia that they may cap capacity at current levels would become major constraints if literally implemented.
Possibly the most important constraint of all is price. Oil came to dominate our societies because it was both plentiful and cheap. It is now expensive and its supply is becoming constrained. In mid-2008, the world found out the hard way that it could not afford high-price oil. Or to be more accurate, the Atlantic basin economies of Europe and North America found they could not afford high-price oil. The rest of the world, thanks to a combination of fuel subsidies and dynamic economies, was rather less affected.
When the levels of oil prices and oil production are compared, it can be shown that from 2000 to 2003 prices were steady at around $25/barrel and that production responded by growing – meaning that additional supply was forthcoming without prices rising. From 2004, however, prices started rising steadily, but supply stopped rising from early 2005. This means prices had to rise further to reconcile demand growth with static supply until the price boom-and-bust cycle of 2008 initiated the Great Recession (with a little help from the bankers).
As the economy recovered so did oil prices as the supply response remained minimal. The geopolitical upheaval of the Arab Spring and the Libyan conflict appeared on the point of initiating another boom-and-bust cycle but, for the moment, prices have fallen back and oil output appears adequate.
Looking forward, there is little or no chance of enough reasonably low-cost oil being found and developed to alter the pattern of tightening supply and rising prices, interspersed with periodic busts as high oil prices undermine economic growth. All the indications are that by around 2013 there will be no Opec spare capacity to turn on, insufficient new flows to meet demand and prices will be soaring. In short, Peak Oil will have arrived when the flow of new capacity will be insufficient to offset the loss of capacity to depletion.
Over the last two to three years, the link between oil prices and GDP growth has moved from the economic fringes to the economic mainstream. It is now widely accepted that high oil prices inhibit growth and very high prices will trigger a recession, although the speed of the rise may be as important as, or more important than, the absolute level. There is effectively no time for adaptation in the face of a very rapid price rise.
Saturday, September 6, 2014
Historic Day for Tidal Energy in the US
Maine regulators have directed three utilities to buy 4 megawatts (MW) of tidal electricity from Ocean Renewable Power Company, making it the first state to commercialize ocean energy. Installation of the first unit began in March in Cobscook Bay and will be finished by late summer, feeding electricity to the grid by October 1.
In fall 2013, the company will add four more devices with a total capacity of 900 kilowatts, enough to power about 100 homes.
The 4 MW project will supply electricity for over 1000 homes by 2016.
The Maine Public Utilities Commission (PUC) approved a term sheet for the nation's first power purchase contract for tidal energy, to be in place for 20 years.
The term sheet sets the price to be paid for tidal power at 21.5 cents per kilowatt hour, much higher than typical rates of 11-12 cents. The rate will rise 2% a year and makes the project feasible.
In making the decision, regulators looked at what the cost of fossil fuels would be over 20 years and decided they would likely be even higher. In fact, they see tidal energy being cost-competitive in as little as five years.
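A quick sketch of what the contracted rate path looks like over the 20-year term, using the figures above (the 11-12 cent comparison is held flat here purely for illustration; the regulators' point is that fossil-fuel rates are expected to rise):

start_rate = 0.215      # $/kWh in year 1 of the tidal contract
escalation = 0.02       # contracted 2% annual increase
typical_rate = 0.115    # midpoint of the 11-12 c/kWh quoted for conventional supply

rates = [start_rate * (1 + escalation) ** year for year in range(20)]
print(f"year 1 rate:  {rates[0] * 100:.1f} c/kWh")
print(f"year 20 rate: {rates[-1] * 100:.1f} c/kWh")
print(f"initial premium over a typical rate: {rates[0] / typical_rate - 1:.0%}")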
The International Energy Agency's International Vision for Ocean Energy sets a goal for the technology to be cost-competitive by 2020.
Thursday, September 4, 2014
Perth company seeks $4m for new wave energy technology
Perth-based renewable energy company Bombora Wave Power Australia has launched its first round of capital raising, to help fund the next phase of development of its award-winning wave energy technology. The company, run by WA brothers Shawn and Glen Ryan, is hoping to raise $4 million towards the next two years of development of its home-grown Wave Energy Converter (WEC) technology, which has so far been tank tested and cleared for technology readiness. The WEC technology uses a unique ramp-like feature to capture both heave and surge motions within a wave to extract more of its energy. The (patent-pending) design impedes the wave’s forward motion, forcing it to rise higher, accentuating the forces acting on the power capture elements of the device. It also restricts flow back over the structure during a wave trough, lowering the wave depth and emphasising the effective height variation of the wave as it passes.
Wednesday, September 3, 2014
Using Cogeneration For Heating Swimming Pools
A 100kW cogeneration plant has been installed (July 2013) at the North Sydney Olympic Pool by the contractor Urban Energy. The plant, which will be powered by natural gas, will reduce CO2 emissions by 367 tonnes per annum and contribute to achieving Council’s sustainability targets. The plant will produce more than 450,000kWh of electricity per annum, saving $58,000 on the Pool’s power bill. The Olympic Pool facility accounts for 35% of Council’s total electricity use, with an annual consumption of more than 1,450,000kWh.
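Working backwards from the quoted figures (approximate, since the article says "more than" 450,000 kWh), the plant would run at around a 50% capacity factor and displace electricity worth roughly 13 cents per kWh:

plant_kw = 100
annual_kwh = 450_000
annual_saving = 58_000          # dollars per year on the power bill
pool_annual_kwh = 1_450_000

capacity_factor = annual_kwh / (plant_kw * 8760)          # 8760 hours in a year
value_per_kwh = annual_saving / annual_kwh
share_of_pool_load = annual_kwh / pool_annual_kwh

print(f"capacity factor:          ~{capacity_factor:.0%}")
print(f"implied value of output:  ~{value_per_kwh * 100:.0f} c/kWh")
print(f"share of the pool's load: ~{share_of_pool_load:.0%}")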
Cogeneration technology, or cogen as it is usually called, is the process of using a heat engine to simultaneously generate electricity and heat - both essential for the operation of the Olympic Pool. The new system will heat the outdoor pool to a pleasant 25 degrees all year round and maintain the indoor pool at a steamy 29 degrees.