Sep 19, 2014
In what is becoming a macabre "Comedy of Errors," TEPCO officials are again beset with problems at the Fukushima Daiichi power plant. This time, the accidental dumping of 200 tons of highly radioactive water into a group of buildings housing the central waste processing facilities last Friday is being blamed on four pumps that were not supposed to be in use but were mysteriously turned on by a party or parties unknown. What is really interesting about this story is that the problem was first noticed on Thursday, April 10, but the pumps were not turned off until three days later, on April 13.

Recreating the incident, TEPCO officials explained that the water used to cool the molten reactors becomes highly radioactive. That water is supposed to be directed to a storage building, from which it is later transferred to a facility for decontamination. But on April 10, workers noticed that the water level in buildings that should have been pumped out was rising instead of falling. Two days later, on April 12, it was finally discovered that the four pumps, supposedly not in use, were running. Right here, there should be an investigation into why the pumps were still on line at all if they were not in use. All four pumps were turned off around 5 p.m. on April 13. By that time, 200 tons of radioactive water had flooded the wrong buildings.

TEPCO reported the incident to the Nuclear Regulation Authority at that time, and the NRA instructed TEPCO to monitor the situation so that no leakage would escape the buildings or the facility. Just to make things even more interesting, TEPCO reported another, separate incident on April 13, when one ton of treated radioactive water escaped from a leaking storage tank. TEPCO officials said none of that water reached the sea.
Local authorities have repeatedly asked that TEPCO pinpoint the causes for the numerous problems at the disabled power plant, but TEPCO officials have said very little about any of the problems. In February, over 100 tons of contaminated water leaked from a storage tank due to a deliberately opened valve. According to TEPCO officials, over 100 workers have been interviewed, and so far, no one knows exactly what happened.
Mother Jones: On Tuesday, the White House officially announced that it would be sending US troops to Liberia to fight the Ebola outbreak. The military has already requested to use $500 million from its Overseas Contingency Operations budget to deal with Ebola in West Africa and ISIS in Iraq, and plans to request another $500 million to combat the epidemic, which United Nations officials have said is needed to keep the number of cases in the "tens of thousands." (So far, the World Health Organization is aware of about 5,000 people who it believes have been infected in Liberia, Guinea, and Sierra Leone, although it says the actual toll is probably much higher.)
More MoJo coverage of the Ebola crisis.
But as the military heads to Liberia, concerns over the region's stability—and to what degree US troops will be involved in maintaining it—still hover over the entire operation.
The core of the military's 3,000-troop mission in Liberia will be medical—building treatment centers and training medical staff by the hundreds to run them. But the outbreak and resulting panic have caused other problems, some of which the military will deal with and others it may try to avoid. One major problem is a food crisis: Liberia imports about two-thirds of its grain supply, but as its neighbors have closed their borders to prevent the disease from spreading and shipping into the country has slowed, food has become scarcer and prices have increased. To ease this situation, the US operation will help to distribute food aid in the country.
Please read full and follow at: Mother Jones
"We're now in a position where we're supplying Burlington residents with sources that are renewable," said Ken Nolan, manager of power resources for Burlington Electric Department, earlier this month. "The prices are not tied to fossil fuels — they're stable prices — and they provide us with the flexibility, from an environmental standpoint, to really react to any regulation or changes to environmental standards that come in the future."
According to Nolan, the utility will get about one-third of its power from the Joseph C. McNeil Generating Station, one-third from wind energy contracts, and one-third from the hydroelectric stations Winooski One and Hydro-Québec. The McNeil power station is a biomass facility that primarily uses wood chips from logging residue left over from the harvesting of wood for other products.
Sep 18, 2014
Gov. Jerry Brown signed legislation on Tuesday overhauling the state's management of its groundwater supply, bringing it in line with other states that have long regulated their wells.
Groundwater makes up nearly 60 percent of California's water use during dry years. But it is not monitored and managed the same way as water from reservoirs and rivers.
Supporters of the legislation say the worst drought in a generation inspired them to rethink the state's hands-off approach to tapping wells, which has led to sinking land and billions of dollars in damage to aquifers, roads and canals.
'This is a big deal,' Brown said at the signing ceremony in his office. 'It has been known about for decades that underground water has to be managed and regulated in some way.'
The package signed into law requires some local governments and water districts to begin managing their wells, and it authorizes state water agencies to intervene if necessary. It also allows for water metering and fines to monitor and enforce restrictions.
SB1168, SB1319 and AB1739 by Assemblyman Roger Dickinson, D-Sacramento, and Sen. Fran Pavley, D-Agoura Hills, passed in the final days of the legislative session over objections from Republican lawmakers and Central Valley Democrats.
The opposition was driven by agricultural interests that are increasingly dependent on pumping from wells as reservoirs dry up and government water allocations plunge in the drought. They say the legislation was rushed and punishes well-managed agencies while infringing on property rights.
'While there is legitimate concern about the over-drafting of some groundwater basins, this massive expansion of state authority will not solve the problem,' said Assembly Minority Leader Connie Conway, R-Tulare.
Brown said in a signing message he would push for legislation next year to streamline resolutions in disputes over groundwater rights.
Unlike other states that treat groundwater as a shared resource, California property owners have been entitled to tap water beneath their land since the Gold Rush days.
Please read on at:
Around 80 percent of all antibiotics used in the U.S. are given to livestock, Grow and Huffstutter write. "About 390 medications containing antibiotics have been approved to treat illness, stave off disease and promote growth in farm animals. But the U.S. Food and Drug Administration has reviewed just 7 percent of those drugs for their likelihood of creating antibiotic-resistant superbugs, a Reuters data analysis found."
Reuters reviewed more than 320 feed tickets from six major poultry companies during the past two years, finding that in every instance "the doses were at the low levels that scientists say are especially conducive to the growth of so-called superbugs, bacteria that gain resistance to conventional medicines used to treat people," Grow and Huffstutter write.
Two producers, George's and Koch Foods, "have administered drugs belonging to the same classes of antibiotics used to treat infections in humans," Grow and Huffstutter write. "The practice is legal. But many medical scientists deem it particularly dangerous because it runs the risk of promoting superbugs that can defeat the life-saving human antibiotics."
The poultry industry said the antibiotics pose little threat to humans, Grow and Huffstutter write. Tom Super, spokesman for the National Chicken Council, told Reuters, "Several scientific, peer reviewed risk assessments demonstrate that resistance emerging in animals and transferring to humans does not happen in measurable amounts, if at all." He said using antibiotics to prevent diseases in flocks "is good, prudent veterinary medicine. . . . Prevention of the disease prevents unnecessary suffering and prevents the overuse of potentially medically important antibiotics in treatment of sick birds."
Health authorities disagree. The "World Health Organization called antibiotic resistance 'a problem so serious it threatens the achievements of modern medicine,'" Reuters writes. The annual cost to battle antibiotic-resistant infections is estimated at $21 billion to $34 billion in the U.S., WHO said. Each year, about 430,000 people in the U.S. become ill from food-borne bacteria that resist conventional antibiotics, according to a July report by the Centers for Disease Control and Prevention. Overall, the CDC estimates that 2 million people are sickened in the U.S. annually with infections resistant to antibiotics and at least 23,000 people die. (Read more)
reports for The Wall Street Journal. "A magnitude 5.3 quake in August 2011 occurred near two injection wells. The wells were just a few miles from the site of the quake and were injecting 'more than 400,000 barrels of wastewater per month' in the 16 months before the quake, the study said." (WSJ graphic)
"Researchers said that since mid-2000 the total injection rate of the 21 high-volume wastewater disposal wells in Colorado and seven in New Mexico has ranged from 1.5 to 3.6 million barrels per month," Daniel Wallis reports for Reuters. "They said the timing and location of seismic events correspond to the documented pattern of injected wastewater and that their findings suggest seismic events are initiated shortly after an increase in injection rates." (Read more)
Sep 17, 2014
US tech giant Google has signed up for its 17th renewable energy project, to put $145 million towards an 82 megawatt solar project being built by SunEdison on a former oil and gas field. The deal, announced last week, puts the internet search company's clean energy investment tab at more than $1.5 billion, for projects spanning three continents and totalling a capacity of more than 2.5GW.
This latest, the "Regulus" solar project in Google's home state of California, will be SunEdison's largest in North America once finished, comprising more than 248,000 monocrystalline solar PV panels spanning 737 acres. "Over the years, this particular site in California has gone from 30 oil wells to five as it was exhausted of profitable fossil fuel reserves," writes Google's renewable energy principal Nick Coons on Google's blog. "The land sat for some time and today we're ready to spiff things up."
Please read full and follow at: Peak Energy
House Passes "Waters of the U.S." Regulatory Overreach Bill; EPA says proposed rule clarifies protection for streams and wetlands.
H.R. 5078 requires the EPA and the Corps to revisit the proposed rule with direct consultation with state and local officials to determine which bodies of water should be covered under the Clean Water Act.
EPA and the Army Corps of Engineers jointly released a proposed rule on April 21, “Definition of Waters of the United States Under the Clean Water Act,” designed to clarify which waterways are subject to the Clean Water Act (CWA) discharge permitting requirements. Because the CWA affects many aspects of federal and state regulation, some are calling this proposed rule the most significant CWA development in years.
EPA has stated that the proposed rule does not protect any new types of waters that have not been covered under the CWA, and will reduce confusion about CWA protection since the landscape has been complex following Supreme Court decisions in 2001 and 2006. EPA issued a press release stating that the proposed rule will benefit businesses by increasing efficiency in determining coverage of the CWA.
According to EPA, the proposed rule clarifies protection for streams and wetlands. “The proposed definitions of waters will apply to all Clean Water Act programs. It does not protect any new types of waters that have not historically been covered under the Clean Water Act and is consistent with the Supreme Court’s more narrow reading of Clean Water Act jurisdiction,” EPA stated in a press release.
The EPA also stated that the proposed rule clarifies that under the CWA and based on science:
- Most seasonal and rain-dependent streams are protected;
- Wetlands near rivers and streams are protected;
- Other types of waters may have more uncertain connections with downstream water, and protection will be evaluated through a case-specific analysis of whether the connection is or is not significant.
This new category known as “other waters”— waters that do not fit into any predefined categories — essentially gives EPA the authority to determine that a body of water falls within the “waters of the U.S.” jurisdiction if it shows, either alone or in combination with other “similarly situated” waters in the region, that the water has a “significant nexus” to traditional navigable waters. The proposed rule also creates definitions for “tributary” and “neighboring waters.” Furthermore, the definitions now include adjacent wetlands and “waters,” which were formerly just adjacent wetlands.
Many industries, as well as members of Congress, have argued that the proposed rule does not simply clarify CWA jurisdiction but effectively expands it, extending federal authority over streams, ditches, ponds, and other local water bodies; they have also expressed concerns about the ambiguity of some of the proposed definitions. In particular, the fact that the proposal would allow EPA to define, on a case-by-case basis, any waters as being within its jurisdiction has created serious concerns among industry. Industries have also indicated that the proposal would deviate from the spirit of current law, which applies specifically to truly "navigable waters."
ACA and its Environmental Management Committee are closely monitoring the proposed rule and its potential impact on our industry, particularly issues relating to storm water, wastewater, and EPA’s Spill Prevention, Control, and Countermeasure (SPCC) Rule.
Together with CEPE (the European coatings and printing ink association), the International Paint and Printing Ink Council (IPPIC) has a procedure for Efficacy Evaluation of Antifouling Products that is annexed with the new ECHA guidance. However, the degree of rigor and amount of data asked for in the EU guidance go beyond the recommendations in this industry methodology.
The coatings industry — in particular, the marine coatings sector — has worked closely with the EU for several years to initiate and finalize this ECHA guidance. Efficacy testing is part of product development processes that can take years; with ECHA’s new guidance, industry will know exactly what data is required under the BPR. As such, industry now has to be able to design test procedures in accordance with what has to be provided in the various EU countries.
Of note, antifouling products will not be evaluated under the BPR at the EU level, but rather, by Evaluating Member States chosen by each applicant.
The new ECHA guidance is posted on the organization’s website at
Please continue reading from: 100% Of Power For Vermont City Now Renewable
IEEE Spectrum has a look at a new OTEC pilot plant in Hawaii - Ocean Thermal Energy: Back From the Deep.
The finished facility, which will be only a bit higher in capacity than existing test plants in Japan and South Korea, is quite modest by energy production standards. The plant will be able to produce at most 100 kilowatts of power—enough, when operating continuously, to supply electricity to about 80 average American homes.
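The "about 80 homes" figure can be sanity-checked with quick arithmetic; the per-home consumption used below (roughly 10,900 kWh per year, an EIA-style average for US households) is an assumption, not a number from the article:

```python
# Back-of-envelope check of the "about 80 homes" claim for a 100 kW
# plant running continuously. The average-home figure is an assumption.
PLANT_KW = 100
HOURS_PER_YEAR = 24 * 365            # continuous operation
AVG_HOME_KWH_PER_YEAR = 10_900       # assumed average US household use

annual_output_kwh = PLANT_KW * HOURS_PER_YEAR     # 876,000 kWh
homes_supplied = annual_output_kwh / AVG_HOME_KWH_PER_YEAR

print(f"Annual output: {annual_output_kwh:,} kWh")
print(f"Homes supplied: {homes_supplied:.0f}")    # roughly 80
```

The round number works out almost exactly, which suggests the article's figure came from the same kind of estimate.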
The plant and its pumps will consume most of the energy produced. But Makai's plant is geared toward research, Eldred notes, not energy generation. The facility was built primarily to design and test heat exchangers, which are among the most expensive components of an OTEC plant. With the addition of a turbine, Eldred says, Makai will be able to design an automatic control system and improve both performance and cost predictions for its commercial plant designs. The company also hopes to get a sense of how fluctuations in the temperature and pressure of ocean water will alter power output, a factor that might prove significant for wave-tossed offshore plants.
That's likely to be where OTEC energy production winds up. A 10-megawatt plant, such as one that Lockheed Martin aims to build for China's Reignwood Group, will require a cold-water pipe that is several meters wide. A plant floating in open water could send a pipe straight to the depth required instead of diagonally, down a long slope extending out from shore. That would make for a shorter and less expensive pipe, reduce the impact on the landscape, and cut down on the energy required to pump the cold water.
The first large-scale plant to make the leap could be New Energy for Martinique and Overseas (NEMO), says Luis Vega of the University of Hawaii's National Marine Renewable Energy Center. The project, which is a collaboration between renewable energy firm Akuo Energy and naval defense company DCNS, both based in Paris, plans to construct a 16-MW plant about 5 kilometers off the shore of the island of Martinique.
Construction is set to start next year, and the team aims to have the plant operational in four years. When complete, NEMO should be able to supply some 11 MW of energy to the Caribbean island, with the other 5 MW powering the plant and its pumps.
Please read full and follow at: Peak Energy
Sep 15, 2014
Public Health Disaster Research Response
September 19, 2014, 1:00-2:15 p.m. EDT
To participate, please see Webinar Information at:
Description: Public health emergencies can cause many adverse health effects for local communities and first responders. However, there is a recognized knowledge gap regarding environmental exposures during a disaster and the resulting health outcomes. The recent call to action, "Research as Part of Public Health Emergency Response," authored by the NIH Director, the HHS Assistant Secretary for Preparedness and Response, and the CDC Director, voiced the critical need for well-designed, effectively executed research to address these pressing knowledge gaps. Public health emergencies present challenges to the research community because of their unpredictable onset as well as their health and environmental effects. Currently, human subjects research in the period immediately following disasters is hampered by the time needed to design protocols and implement data collection, so the opportunity to acquire crucial early epidemiologic, medical, and environmental data and samples is usually missed. The NIEHS, HHS, other federal agencies, and the academic community are working to address this need.
In this webinar, we will hear about the current Public Health Disaster Research Response (DR2) and Science Preparedness efforts to help respond to the need for timely research.
Anthony Barone, M.P.H.
Senior Management Analyst - Emergency Management, GAP Solutions, Inc.
Assigned to the Office of the Assistant Secretary for Preparedness and Response (ASPR)
Aubrey Miller, M.D., M.P.H.
Senior Medical Advisor, National Institute of Environmental Health Sciences
Sharon Petronella Croisant, Ph.D.
Associate Professor, University of Texas Medical Branch at Galveston
We look forward to your participation!
Sidecar revealed Thursday that it received the letter from the California Public Utilities Commission, which said the ride-sharing service was breaking the law by testing its new Shared Rides, or carpool, feature.
A Lyft spokesperson told CNET that it too received a similar letter. Initially, Uber told CNET that it didn't get the letter but now the company says it was indeed contacted by the CPUC. The CPUC also confirmed with CNET that it sent two copies of the letter to Uber -- one to company CEO Travis Kalanick and one to Chairman Garrett Camp -- on September 8.
"Uber recently announced its intent to offer a new transportation service known as UberPool," the CPUC letter reads. "Uber has not yet approached the Commission regarding the UberPool service... Uber's proposed transportation service violates existing California law."
Please read full and follow at: http://www.cnet.com/news/california-deems-all-ride-share-carpooling-services-illegal/
Royal Dutch Shell has teamed with a sovereign investment fund from Oman to invest $53 million in a company that manufactures solar power equipment designed for increasing oil production. GlassPoint Solar Inc. installs aluminum mirrors near oil fields that concentrate solar radiation on insulated tubes containing water. The steam generated from heating the water is injected into oil fields to recover heavy crude oil. This technique, known as enhanced oil recovery, involves high-pressure injection of hot fluids. The use of renewable energy like solar power makes great economic sense, as the fuel cost associated with this enhanced oil recovery technology is practically zero. Shell hopes to employ this technology in its oil fields in Oman and to reduce the greenhouse gas emissions associated with enhanced oil recovery operations. A large-scale successful implementation of this technology could be a game changer for major consumers like India and the U.S. Both have substantial oil reserves but are unable to tap them due to the high costs involved in heavy oil recovery.
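A rough thermal sketch shows why mirrors and water pencil out for steam flooding. Every number below (feedwater temperature, solar irradiance, collector efficiency) is an illustrative assumption, not a GlassPoint specification:

```python
# Rough sketch: how much steam can one square metre of mirror raise
# per day? All inputs are assumed, illustrative values.
CP_WATER = 4.18            # kJ/(kg*K), specific heat of liquid water
LATENT_HEAT = 2257         # kJ/kg, latent heat of vaporisation near 100 C
T_FEED, T_BOIL = 25, 100   # C, assumed feedwater and boiling temperatures

DNI_KWH_M2_DAY = 6.0       # assumed direct sunlight in a desert oil field
COLLECTOR_EFF = 0.6        # assumed optical/thermal efficiency of the trough

heat_per_kg = CP_WATER * (T_BOIL - T_FEED) + LATENT_HEAT   # kJ per kg of steam
solar_kj_per_m2_day = DNI_KWH_M2_DAY * 3600 * COLLECTOR_EFF

steam_kg_per_m2_day = solar_kj_per_m2_day / heat_per_kg
print(f"~{steam_kg_per_m2_day:.1f} kg of steam per m2 of mirror per day")
```

Real EOR steam is raised at much higher pressure and temperature, so the true yield per square metre is lower, but the zero-fuel-cost logic in the paragraph above is the same.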
Read more of this story at Slashdot.
Other researchers have found that the impact is greater at the lowest levels. The world would have hundreds of thousands more geniuses were it not for the effects of lead.
According to recent research, levels of airborne heavy metal particles are 10-20 times higher on average in China than in the US.
Hong Kong and China have levels of 10 micrograms - double the 5 microgram US pollution standard. Governments around the world are slowly tightening legislation regulating air polluters, but enforcement lags regulation and is not enough to keep pace with the increase in pollution generated by economic growth.
Read more at Next Big Future
Sep 13, 2014
Researchers develop novel supercapacitor architecture that provides two times more energy and power compared to supercapacitors commercially available today
Improved Supercapacitors for Super Batteries, Electric Vehicles
(www.ucr.edu) — Researchers at the University of California, Riverside have developed a novel nanometer scale ruthenium oxide anchored nanocarbon graphene foam architecture that improves the performance of supercapacitors, a development that could mean faster acceleration in electric vehicles and longer battery life in portable electronics.
The researchers found that supercapacitors (energy storage devices, like batteries and fuel cells) based on a transition-metal-oxide-modified nanocarbon graphene foam electrode could work safely in an aqueous electrolyte and deliver two times more energy and power than supercapacitors commercially available today.
The foam electrode was successfully cycled over 8,000 times with no fading in performance. The findings were outlined in a recently published paper, "Hydrous Ruthenium Oxide Nanoparticles Anchored to Graphene and Carbon Nanotube Hybrid Foam for Supercapacitors," in the journal Nature Scientific Reports.
The paper was written by graduate student Wei Wang; Cengiz S. Ozkan, a mechanical engineering professor at UC Riverside's Bourns College of Engineering; Mihrimah Ozkan, an electrical engineering professor; Francisco Zaera, a chemistry professor; Ilkeun Lee, a researcher in Zaera's lab; and other graduate students Shirui Guo, Kazi Ahmed and Zachary Favors.
Supercapacitors (also known as ultracapacitors) have garnered substantial attention in recent years because of their ultra-high charge and discharge rate, excellent stability, long cycle life and very high power density.
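The "two times more energy" result can be grounded in basic capacitor physics: an ideal capacitor stores E = ½CV², so energy scales linearly with capacitance and quadratically with the voltage window. The capacitance and voltage figures below are illustrative, not values from the paper:

```python
# Why electrode materials matter: stored energy in an ideal capacitor
# is E = 1/2 * C * V^2. Numbers here are illustrative only.
def stored_energy_j(capacitance_f, voltage_v):
    """Energy in joules stored in an ideal capacitor."""
    return 0.5 * capacitance_f * voltage_v ** 2

baseline = stored_energy_j(100, 1.0)   # hypothetical 100 F cell at 1.0 V
improved = stored_energy_j(200, 1.0)   # doubled capacitance, same voltage

print(f"baseline: {baseline:.0f} J, improved: {improved:.0f} J")
print(f"ratio: {improved / baseline:.1f}x")   # 2.0x
```

Doubling effective capacitance at the same voltage doubles stored energy, which is one route (alongside a wider voltage window) to the kind of improvement the paper reports.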
Please continue reading from:
Pesticides in U.S. Streams and Rivers: Occurrence and Trends during 1992–2011 - Environmental Science & Technology (ACS Publications)
During the 20 years from 1992 to 2011, pesticides were found at concentrations that exceeded aquatic-life benchmarks in many rivers and streams that drain agricultural, urban, and mixed-land use watersheds. Overall, the proportions of assessed streams with one or more pesticides that exceeded an aquatic-life benchmark were very similar between the two decades for agricultural (69% during 1992−2001 compared to 61% during 2002−2011) and mixed-land-use streams (45% compared to 46%). Urban streams, in contrast, increased from 53% during 1992−2001 to 90% during 2002−2011, largely because of fipronil and dichlorvos. The potential for adverse effects on aquatic life is likely greater than these results indicate because potentially important pesticide compounds were not included in the assessment. Human-health benchmarks were much less frequently exceeded, and during 2002−2011, only one agricultural stream and no urban or mixed-land-use streams exceeded human-health benchmarks for any of the measured pesticides. Widespread trends in pesticide concentrations, some downward and some upward, occurred in response to shifts in use patterns primarily driven by regulatory changes and introductions of new pesticides.
Please continue reading from:
Sep 12, 2014
Sustainablog; What do you do with your coffee waste? You may add the grounds to your compost bin, or put them directly onto your roses and other plants as a fertilizer. If you're in the business of coffee processing, though, you're dealing with much more waste… particularly wastewater. According to Dutch sustainable agriculture non-profit UTZ Certified, every cup of coffee we drink requires 140 litres (nearly 40 gallons) of water to process. After that water's used, it's full of organic material. If it simply gets discharged into other water sources, as it often does in the developing world, it's not only polluting water and putting wildlife and people at risk, but it's also creating methane emissions.
Not exactly the kind of guilt you want to deal with first thing in the morning, right? Of course, the people living near these facilities don't want their water dirtied up, either. UTZ Certified has been experimenting with a system that not only cleans up the water, but turns the methane into biogas that can be used by the facility, local farmers, and other nearby residents. After four years of testing in nineteen pilot sites around Central America, the organization has released its results… and they look good. Among the findings:
- The wastewater treatment was able to dispose of 80-90% of water contamination;
- Biogas production helped reduce firewood use for local communities, or, in the case of the largest facilities, power portions of the processing machinery;
- Carbon emissions were decreased by 8-9 tons per community because of the shift away from firewood.
Not bad for initial efforts to deal with these environmental issues. The experiment was also able to cut the amount of water needed for processing in half.
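To put the 140-litres-per-cup figure and the halving result in perspective, here is a toy calculation; the one-million-cup annual throughput is an assumed number for illustration, not from UTZ's report:

```python
# Illustrative scale of the water savings for a single mill whose
# annual output ends up as one million cups of coffee. Throughput
# is an assumption; litres-per-cup is the article's figure.
LITRES_PER_CUP = 140        # processing water per cup (from the article)
CUPS = 1_000_000            # assumed annual throughput

before = LITRES_PER_CUP * CUPS
after = before / 2          # the pilot system halved water use

print(f"before: {before/1e6:.0f} million litres per year")
print(f"after:  {after/1e6:.0f} million litres per year")
print(f"saved:  {(before - after)/1e6:.0f} million litres per year")
```

Even at this modest assumed scale, the savings are tens of millions of litres per mill per year, which is why the results matter most where fresh water is scarce.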
Needless to say, UTZ Certified feels pretty good about these findings, and is now introducing the technology into Peru and Brazil. With more funding, the organization would love to take the wastewater treatment system to coffee producing and processing regions in Africa and Asia.
No doubt, developing world coffee producers need the first shot at this technology: their options for fresh water are often limited. But I'd love to consider the potential for coffee processing here in the developed world, too… no doubt there's wastewater coming from our facilities that could be put to productive use.
Thoughts? Ideas? Share them with us in the comments.
The post Coffee Processing Wastewater: a Great Source of Energy appeared first on Sustainablog.
Sustainablog: In 2013, world geothermal electricity-generating capacity grew 3 percent to top 11,700 megawatts across 24 countries. Although some other renewable energy technologies are seeing much faster growth—wind power has expanded 21 percent per year since 2008, for example, while solar power has grown at a blistering 53 percent annual rate—this was geothermal's best year since the 2007-08 financial crisis.
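A quick compound-growth calculation shows how fast those annual rates diverge; the five-year horizon is just an illustrative choice:

```python
# Capacity multiplier after five years of compound growth at each
# technology's annual rate (rates are from the article).
def multiplier(annual_rate, years=5):
    return (1 + annual_rate) ** years

for name, rate in [("geothermal", 0.03), ("wind", 0.21), ("solar", 0.53)]:
    print(f"{name:10s}: x{multiplier(rate):.1f} after 5 years")
```

At these rates, geothermal capacity grows about 16 percent in five years while solar capacity grows more than eightfold, which is the gap the paragraph above describes.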
Geothermal power's relatively slower growth is not due to a paucity of energy to tap. On the contrary, the upper six miles of the earth's crust holds 50,000 times the energy embodied in the world's oil and gas reserves. But unlike the relative ease of measuring wind speed and solar radiation, test-drilling to assess deep heat resources prior to building a geothermal power plant is uncertain and costly. The developer may spend 15 percent of the project's capital cost during test-drilling, with no guarantee of finding a viable site.
Once built, however, a geothermal power plant can generate electricity 24 hours a day with low operation and maintenance costs—importantly because there is zero fuel cost. Over the life of the generator, geothermal plants are often cost-competitive with all other power sources, including fossil fuel and nuclear plants. This is true even without considering the many indirect costs of fossil- and nuclear-generated electricity that are not reflected in customers' monthly bills.
The top three countries in installed geothermal power capacity—the United States, the Philippines, and Indonesia—account for more than half the world total. California hosts nearly 80 percent of the 3,440 megawatts of U.S. geothermal capacity; another 16 percent is found in Nevada.
Despite having installed more geothermal power capacity than any other country, the United States currently generates less than 1 percent of its electricity from the earth's heat. Iceland holds the top spot in that category, using geothermal power for 29 percent of its electricity. Close behind is El Salvador, where one quarter of electricity comes from geothermal plants. Kenya follows at 19 percent. Next are the Philippines and Costa Rica, both at 15 percent, and New Zealand, at 14 percent.
Indonesia has the most ambitious geothermal capacity target. It is looking to develop 10,000 megawatts by 2025. Having only gained 150 megawatts in the last four years, this will be a steep climb. But a new law passed by the government in late August 2014 should help move industry activity in that direction: it increases the per-kilowatt-hour purchase price guaranteed to geothermal producers and ends geothermal power's classification as mining activity. (Much of Indonesia's untapped geothermal resource lies in forested areas where mining is illegal.) Even before the new law took effect, geothermal company Ormat began construction on the world's largest single geothermal power plant, a 330-megawatt project in North Sumatra, in June 2014. The plant should generate its first electricity in 2018.
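Some rough arithmetic shows why the 10,000-megawatt target is "a steep climb." Indonesia's installed base (about 1,400 MW in 2014) is an assumed figure, not stated in the article; the 150 MW over four years is from the article:

```python
# Required build rate to hit the 2025 target versus the recent pace.
# Installed base is an assumption; other figures are from the article.
TARGET_MW, TARGET_YEAR = 10_000, 2025
INSTALLED_MW, NOW = 1_400, 2014          # assumed installed base in 2014
RECENT_MW_PER_YEAR = 150 / 4             # 150 MW added over four years

needed_per_year = (TARGET_MW - INSTALLED_MW) / (TARGET_YEAR - NOW)
print(f"needed: {needed_per_year:.0f} MW/yr, recent pace: {RECENT_MW_PER_YEAR:.1f} MW/yr")
print(f"ratio: {needed_per_year / RECENT_MW_PER_YEAR:.0f}x")
```

Under these assumptions, Indonesia would need to add capacity roughly twenty times faster than its recent pace, which is why the pricing and land-use reforms in the new law matter.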
Indonesia is just one of about 40 countries that could get all their electricity from indigenous geothermal power—a list that includes Ecuador, Ethiopia, Iceland, Kenya, Papua New Guinea, Peru, the Philippines, and Tanzania. Nearly all of them are developing countries, where the high up-front costs of geothermal development are often prohibitive.
To help address this mismatch of geothermal resources and funds, the World Bank launched its Global Geothermal Development Plan in March 2013. By December, donors had come up with $115 million of the initial $500 million target to identify and fund test-drilling for promising geothermal projects in the developing world. The Bank hopes that the experience gained from these projects will lead to lower costs for the geothermal industry overall. This would be good news on many fronts—simultaneously reducing energy poverty, air pollution, carbon emissions, and costly fossil fuel imports.
For data and additional resources visit www.earth-policy.org.
The post Geothermal Power Approaches 12,000 Megawatts Worldwide appeared first on Sustainablog.
The estimated cost for decommissioning over the next century went up from a £63.8bn estimate in 2011-12 to £69.8bn in 2012-13, with more increases expected in the coming years. This hike is almost entirely down to the troubled clean-up of the Sellafield nuclear facility in Cumbria, one of the world's most hazardous and fiendishly complicated decontamination sites. NMP, the Nuclear Management Partners consortium that manages the site, had been accused of chronic mismanagement after a series of delays and budget overruns on Sellafield projects, including problems involved in the construction of a storage facility for radioactive sludge.
Please continue reading from: Peak Energy
John Tesvich is a fourth-generation oyster farmer in Empire, a tiny Gulf Coast enclave south of New Orleans. He's spent his life working in the rich oyster beds here, the most productive in the nation, and has weathered his share of storms: During Hurricane Katrina, his house ended up under 17 feet of water. But last week, as he navigated his 40-foot oyster boat out into open water, he admitted that the turmoil this region has faced in the last decade was beginning to wear him down.
"A lot has changed over the years," he said. "It seems like one crisis after another sometimes."
One crisis was particularly damaging to Tesvich's industry: The 2010 Deepwater Horizon oil spill. The fourth anniversary of the busted undersea well's sealing (after it gushed crude into the Gulf for nearly five months) is coming up next week, and Tesvich, who also chairs the oyster industry's main statewide lobbying group, says his crop is still struggling to rebound.
Tesvich got some good news last week, when a federal judge in New Orleans found that BP's "willful misconduct" and "gross negligence" had been the principal causes of the spill, a ruling that could eventually force BP to pay billions for ecological restoration in the Gulf. But for oystermen here, whose day-to-day income depends on these reefs, those dollars still seem very far away.
Please read full and follow at: Mother Jones
Read more of this story at Slashdot.
Sep 11, 2014
One of Australia's largest meat processors – and a major regional employer, providing 830 jobs – is among the latest recipients of funding from the Clean Energy Finance Corporation, in a deal to co-finance a major on-site energy project at northern NSW-based Bindaree Beef.
The CEFC announced on Tuesday it would provide up to $15 million, together with additional bank finance and an Australian Government Clean Technology Investment Program grant, to fund the installation of a biodigester and energy efficient rendering facilities to improve the efficiency and competitiveness of operations at Bindaree Beef.
As well as the biodigester, the funding will go towards development of an electricity generation facility using biogas (produced by the biodigester) as fuel, and a new more energy efficient rendering plant to replace the existing coal-fired plant and eliminate the use of coal.
The new equipment is expected to halve the company's power bills and cut its annual carbon emissions by three quarters. The biogas plant will also create a new business revenue stream through sales of organic fertiliser – a by-product of the energy conversion process.
Bindaree Beef Director John Newton said securing finance from the CEFC – a $10 billion Labor government initiative, which remains on the Abbott government's chopping block – had been integral to securing the interest of additional private finance, which, along with the government grant, would cover the total project cost. ...
In March this year, the CEFC contributed $20 million to a funding deal with Quantum Power Limited – Australia's leading biogas company – to catalyse up to $40 million in biogas infrastructure aimed at helping farmers and manufacturers cut costs and boost productivity in the face of rising electricity prices.
Please read full and follow at: Peak Energy
Physicist and energy expert Amory Lovins, chief scientist at Rocky Mountain Institute, recently released a video in which he claims that renewable energy can meet all of our energy needs without fossil fuel or nuclear baseload generation. There's nothing unusual about that - many people have made that claim - but he also suggests that this can be done without a lot of grid-level storage. Instead, Lovins describes a "choreography" between supply and demand, using predictive computer models to anticipate production and consumption, and intelligent routing to deliver power where it's needed. This "energy dance," combined with advances in energy efficiency, will allow us to meet all of our energy needs without sacrificing reliability.
Okay, so there is a little storage involved: ice-storage air conditioning and smart charging of electric vehicles. But where others, including myself, have assumed that large storage devices will need to be added to the grid, Lovins thinks that massive storage facilities are unnecessary, and he presents compelling evidence to support his claim, including actual data from Europe and computer models from NREL. ...
Lovins presents this in the context of storage vs intelligent routing of electricity - which one do we need? That's a false dichotomy. There will always be a need for storage since many applications are off grid. Obviously storage is needed in order to electrify transportation. So I agree that dynamic routing is the best long term solution for the grid, but we still need to invest in storage technologies. The good thing is that both storage and smart routing can be implemented together, a little at a time, and scaled up gradually.
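Lovins' actual models are far more sophisticated, but the core of the "choreography" idea can be sketched in a few lines. The toy simulation below uses entirely invented numbers (it is not RMI or NREL data): it steps through one day of renewable supply against two demand profiles - a rigid one with the usual morning and evening peaks, and a "choreographed" one with the same daily total shifted toward the hours of surplus - and counts how much demand a small storage buffer fails to cover in each case.

```python
# Toy sketch of the supply-demand "choreography" idea.
# All numbers are invented for illustration, not drawn from RMI or NREL.

def unserved_energy(supply, demand, storage_cap):
    """Step through a day hour by hour with a simple storage buffer
    (starting empty); return the total demand (MWh) that went unserved."""
    stored, unserved = 0.0, 0.0
    for s, d in zip(supply, demand):
        if s >= d:
            stored = min(storage_cap, stored + (s - d))  # charge on surplus
        else:
            draw = min(stored, d - s)                    # discharge on deficit
            stored -= draw
            unserved += (d - s) - draw
    return unserved

# Hourly renewable output (MW, invented): low overnight, peaking midday
supply = [2, 2, 2, 2, 3, 5, 8, 10, 12, 13, 13, 13,
          12, 12, 11, 9, 7, 5, 3, 2, 2, 2, 2, 2]

# Rigid demand: the usual morning and evening peaks
rigid = [5, 5, 5, 5, 5, 6, 7, 7, 7, 6, 6, 6,
         6, 6, 6, 7, 8, 9, 9, 8, 7, 6, 5, 5]

# "Choreographed" demand: same daily total, shifted toward sunny hours
flexible = [4, 4, 4, 4, 4, 5, 7, 9, 10, 10, 10, 10,
            9, 9, 9, 8, 7, 6, 5, 4, 4, 4, 3, 3]

assert sum(rigid) == sum(flexible)  # same total energy demanded

for cap in (0, 10, 20):
    print(cap, unserved_energy(supply, rigid, cap),
          unserved_energy(supply, flexible, cap))
```

In this toy run, the shifted load profile cuts unserved energy by more than half even with no storage at all, and each increment of storage then goes further than it would against the rigid profile - which is the qualitative point Lovins makes about choreography reducing, though not eliminating, the need for storage.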
Please read full and follow at: Peak Energy
A Japanese joint venture is set to build what could be the world's largest floating solar project – a 2.9MW PV plant in Hyogo Prefecture, west Japan.
Japanese solar company Kyocera announced the project on its website this week. Kyocera began developing the project in 2012, in conjunction with local real estate and industry group Century Tokyo Leasing, shortly after the introduction of Japan's solar feed-in tariff (FiT). The two companies have already developed 92.8MW of PV across 28 locations in Japan, of which 21.6MW is now online at 11 plants, according to Kyocera, and plan to develop around 60MW of floating PV on roughly 30 sites by May 2015.
This latest floating solar project will consist of two arrays – one 1.7MW, making it the world's largest floating solar plant, and one 1.2MW – which are designed to float on the surface of reservoirs. The use of floating solar technology addresses both the energy deficit created by Japan's shift away from nuclear and its chronic shortage of land on which to build large-scale solar projects.
Please read full and follow at: Peak Energy
As my mind began to spin down, I discovered that calm was like a drug. It felt so good, so decadent, just to sit in the early afternoon with my feet propped on the windowsill, watching wind brush the trees in the front yard. I was hooked.
In December, I called psychology professor and researcher Larry D. Rosen, author of iDisorder: Understanding Our Obsession with Technology and Overcoming Its Hold on Us. "I could put an EEG tap on your head and measure the activity while you're sitting at your computer," he said, "and then I could have you go take a walk. What I would likely see is your brain activity diminish rapidly." What this suggests, he said, is that "technology is highly overloading our brains" and, conversely, that "certain things calm our brains." Simple enough.
Rosen mentioned taking lots of short breaks, finding offline social groups, and, of course, meditation, but I kept coming back to walking. Just before I started my sabbatical, my wife bought me one of those wristband fitness trackers that count your steps. (The absurdity of wiring myself for a break from technology did not escape me.) It comes with a built-in goal of 10,000 steps a day—about five miles. Running, you could do that in 40 minutes, but I loathe running with great fervor, so I walked. My dog Forest and I have since logged 1,400 miles on winding urban hikes through Seattle's tucked-away paths, stairways, and parks. That's 2,723,487 steps, but who's counting?
My rambles have taken me through many miles of greenspace, which, as scientists are belatedly discovering, is a kind of wonder drug itself, with many of the same benefits as meditation. When I chatted with researcher and naturopathic physician Alan Logan, coauthor of 2012's Your Brain on Nature, he described experiments in which cognitively fatigued subjects are taken on a walk, some through a concrete environment, some through urban greenspace. "You come back and you repeat the cognitive testing," he said, "and whether it's memory recall, target identification, or your attention overall, it's consistently far better after having taken a nature walk."
What's going on? Nature provides what University of Michigan psychologist Stephen Kaplan has termed soft fascinations. (Dibs on the band name.) We are shaped by evolution to heed the ebb and flow of drifting clouds, rustling grass, and singing birds. Unlike voluntary or directed attention—the kind required by, say, a spreadsheet—"effortless attention" produces no fatigue. It's the mental equivalent of floating on your back, and a rested mind is a more productive mind.
It’s (still) a materialist world
The guru of modern thinking about the significance of materials is Professor Vaclav Smil of the University of Manitoba, described by Bill Gates as "my favourite author". In his view, physical substances remain central to modern economies in spite of all the advances in information technology, and the apparent evidence of dematerialisation is often misleading.
Please read full and follow at: Peak Energy
In his latest book, Making The Modern World, he cites computer-aided design. The Boeing 747, designed in the 1960s, required 75,000 drawings with a total weight of 8 tonnes. Using computer-aided design (CAD) for the 767 in the 1990s did away with all that paper, and cut costs and design time.
However, as Prof Smil points out, the CAD system required computers, data storage, communications, screens and electricity to run. Given the complexity of the systems involved, it is far from obvious that the switch to CAD cut United States use of materials overall.
It is true that in computing power there has been spectacular dematerialisation. All the computers sold in the world in 2011 weighed 60 times as much as the total sold in 1981, but had 40 million times the memory.
But where microchips are not the dominant component of the total design, Prof Smil wrote, there has been no even remotely similar decline in mass. In some sectors, technological progress has actually made products more "material". The Ford Model T, one of the first automobiles, weighed 540kg; the F-150 pick-up, Ford's most popular model today, weighs more than 2 tonnes.
Prof Smil's conclusion is that while dematerialisation, in the sense of reduced material use for every dollar of gross domestic product, has been a trend for decades and can continue into the future, an absolute reduction in the world's use of natural resources is highly unlikely. If growth continues, at some point those resources will run low.
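The distinction between relative and absolute dematerialisation comes down to compound-growth arithmetic. As a hypothetical illustration (the rates below are invented, not Smil's figures): if material use per dollar of GDP falls 1.5 per cent a year while GDP grows 3 per cent a year, absolute material use still roughly doubles over 50 years.

```python
# Invented illustrative rates -- not Smil's figures.
intensity_decline = 0.015   # material use per $ of GDP falls 1.5%/yr
gdp_growth = 0.03           # GDP grows 3%/yr

use = 1.0                   # absolute material use, indexed to 1.0 today
for year in range(50):
    # absolute use = GDP x intensity, so the two factors compound together
    use *= (1 + gdp_growth) * (1 - intensity_decline)

print(round(use, 2))        # roughly doubles over the 50 years
```

Only when intensity falls faster than GDP grows does absolute use decline, which is why decades of relative dematerialisation have not reduced the world's total draw on resources.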
While we do not know when we will hit the limits of materials usage, we know they are out there somewhere. Tensions such as the dispute over rare earths or rising commodity costs could have serious consequences for growth.
Prof Smil's answer is that we need to think about rational futures of moderated energy and material use. As he admits, though, it is hard to see any political leaders being prepared to offer their citizens less and less in the future; particularly not in emerging economies where billions are hoping to come closer to developed world lifestyles.
"Extremophile" bacteria have been found thriving in soil samples from a highly alkaline industrial site in the Peak District of England. Although the site is not radioactive, the conditions are similar to the alkaline conditions expected in cement-based radioactive waste sites. The researchers say the bacteria's ability to thrive in such conditions and to feed on isosaccharinic acid (ISA) makes them promising candidates for aiding in nuclear waste disposal... Continue reading: Newly-discovered waste-eating bacteria could help in nuclear waste disposal
Tags: Bacteria, Nuclear, Radioactivity, University of Manchester, Waste
- Hitachi developing reactor that burns nuclear waste
- Intelligent absorbent removes radioactive material from water
- Graphene oxide causes radioactive material to "clump" out of water
- Sapphire disks could communicate with future generations 10 million years from now
- Old toilets recycled into new "green" cement
- European Synchrotron Radiation Facility successfully analyzes zeolites
Sep 10, 2014
Please continue reading from: Yale Environment 360