
"Chemical safety is a major priority of EPA and its research," said Dr. Paul Anastas, assistant administrator of EPA's Office of Research and Development. "These databases provide the public access to chemical information, data and results that we can use to make better-informed and timelier decisions about chemicals to better protect people's health."
ToxCastDB users can search and download data from over 500 rapid chemical tests conducted on more than 300 environmental chemicals. ToxCast uses advanced scientific tools to predict the potential toxicity of chemicals and to provide a cost-effective approach to prioritizing which chemicals of the thousands in use require further testing. ToxCast is currently screening 700 additional chemicals, and the data will be available in 2012.
ExpoCastDB consolidates human exposure data from studies that have collected chemical measurements from homes and child care centers. Data include the amounts of chemicals found in food, drinking water, air, dust, indoor surfaces, and urine. ExpoCastDB users can obtain summary statistics of exposure data and download datasets. EPA will continue to add internal and external chemical exposure data and advanced user interface features to ExpoCastDB.
The new databases link together two important pieces of chemical research—exposure and toxicity data—both of which are required when considering potential risks posed by chemicals. The databases are connected through EPA's Aggregated Computational Toxicology Resource (ACToR), an online data warehouse that collects data on over 500,000 chemicals from over 500 public sources.
In the 1940s and ’50s, dichlorodiphenyltrichloroethane, a synthetic pesticide better known as DDT, was used to kill bugs that spread malaria and typhus in several parts of the world. Rachel Carson's famous environmental opus, Silent Spring, argued that DDT was toxic to humans and the environment, and the U.S. government banned it in 1972.
Before all that, though, it was sprayed in American neighborhoods to suppress insect populations. The new movie Tree of Life has a great scene re-enacting the way children would frolic in the spray as the DDT trucks went by. Here are two screen shots from the trailer:
The scene reminded me of an old post we’d written, below, featuring advertisements for the pesticide, one with the ironic slogan “DDT is good for Me-e-e!”
“The great expectations held for DDT have been realized. During 1946, exhaustive scientific tests have shown that, when properly used, DDT kills a host of destructive insect pests, and is a benefactor of all humanity.
Pennsalt produces DDT and its products in all standard forms and is now one of the country’s largest producers of this amazing insecticide. Today, everyone can enjoy added comfort, health and safety through the insect-killing powers of Pennsalt DDT products . . . and DDT is only one of Pennsalt’s many chemical products which benefit industry, farm and home.
GOOD FOR FRUITS – Bigger apples, juicier fruits that are free from unsightly worms . . . all benefits resulting from DDT dusts and sprays.
GOOD FOR STEERS – Beef grows meatier nowadays . . . for it’s a scientific fact that compared to untreated cattle beef-steers gain up to 50 pounds extra when protected from horn flies and many other pests with DDT insecticides.
FOR THE HOME – Helps to make healthier and more comfortable homes . . . protects your family from dangerous insect pests. Use Knox-Out DDT Powders and Sprays as directed . . . then watch the bugs ‘bite the dust’!
FOR DAIRIES – Up to 20% more milk . . . more butter . . . more cheese . . . tests prove greater milk production when dairy cows are protected from the annoyance of many insects with DDT insecticides like Knox-Out Stock and Barn Spray.
GOOD FOR ROW CROPS – 25 more barrels of potatoes per acre . . . actual DDT tests have shown crop increases like this! DDT dusts and sprays help truck farmers pass these gains along to you.
FOR INDUSTRY – Food processing plants, laundries, dry cleaning plants, hotels . . . dozens of industries gain effective bug control, more pleasant work conditions with Pennsalt DDT products.
Researchers say the material could potentially be used to capture waste heat from a car's exhaust that would heat the material and produce electricity for charging the battery in a hybrid car. Other possible future uses include capturing rejected heat from industrial and power plants or temperature differences in the ocean to create electricity. The research team is looking into possible commercialization of the technology.
"This research is very promising because it presents an entirely new method for energy conversion that's never been done before," said University of Minnesota aerospace engineering and mechanics professor Richard James, who led the research team. "It's also the ultimate 'green' way to create electricity because it uses waste heat to create electricity with no carbon dioxide."
To create the material, the research team combined elements at the atomic level to create a new multiferroic alloy, Ni45Co5Mn40Sn10. Multiferroic materials combine unusual elastic, magnetic and electric properties. The alloy Ni45Co5Mn40Sn10 achieves multiferroism by undergoing a highly reversible phase transformation where one solid turns into another solid. During this phase transformation the alloy undergoes changes in its magnetic properties that are exploited in the energy conversion device.
During a small-scale demonstration in a University of Minnesota lab, the new material created by the researchers begins as a non-magnetic material, then suddenly becomes strongly magnetic when the temperature is raised a small amount. When this happens, the material absorbs heat and spontaneously produces electricity in a surrounding coil. Some of this heat energy is lost in a process called hysteresis... A thin film of the material could be used, for example, to convert some of the waste heat from computers into electricity.
"This research crosses all boundaries of science and engineering," James said. "It includes engineering, physics, materials, chemistry, mathematics and more. It has required all of us within the university's College of Science and Engineering to work together to think in new ways."
Read full at PhysOrg
The raw biogas (methane: 60%; carbon dioxide and other gases: 40%) is produced from food residue at the methane fermentation plant of Bio Energy in Ota Ward, Tokyo, the largest food residue methane fermentation plant in Japan.
Read more from Japan for Sustainability

Two new studies found that diet drinks and artificial sweeteners increase people's waistlines and their risk of diabetes.
From ScienceDaily:
Measures of height, weight, waist circumference and diet soda intake were recorded at SALSA enrollment and at three follow-up exams that took place over the next decade. The average follow-up time was 9.5 years... Diet soft drink users, as a group, experienced 70 percent greater increases in waist circumference compared with non-users. Frequent users, who said they consumed two or more diet sodas a day, experienced waist circumference increases that were 500 percent greater than those of non-users.
Abdominal fat is a risk factor for several conditions, including diabetes, heart disease, and cancer. The researchers say this finding shows that national campaigns against sugary drinks should emphasize that replacing them with diet soft drinks won't necessarily make you healthier.
The report doesn't weigh in on whether a raging Diet Coke addiction is preferable to a regular Coke addiction, but it goes on to note that the artificial sweetener aspartame is also bad news if you're concerned about diabetes. In a related study, a group of mice were fed a high-fat diet including the chemical for three months. Compared to the control group, the aspartame-consuming mice had elevated fasting glucose levels and equal or diminished insulin levels. Co-author Dr. Gabriel Fernandes explains:
"These results suggest that heavy aspartame exposure might potentially directly contribute to increased blood glucose levels, and thus contribute to the associations observed between diet soda consumption and the risk of diabetes in humans."
Da Vinci Co., a start-up company specializing in heat control technologies in Nara Prefecture, Japan, has been developing a system that recovers low-temperature waste heat and generates electricity using a rotary heat engine (RHE). The utilization of low-temperature waste heat of 80 to 200 degrees Celsius has long been considered technologically unfeasible. The company intends to complete a 30-kilowatt system by May 2011, start production around December after a durability test, and commence sales in March 2012.
Da Vinci's RHE is an external-combustion type, Wankel-based rotary heat engine driven by the Rankine cycle. Da Vinci has developed RHEs jointly with the University of Tokyo, completing a 500-watt model in June 2009.
The RHE consists of a rotor, power generator, evaporator, condenser, and working fluid. External waste heat applied to the engine vaporizes the working fluid in the evaporator, which rotates the rotor, generating electricity with a directly-coupled generator. Gas that comes out of the rotor turns back into fluid in the condenser, from which point it returns to the evaporator.
Different working fluids including water, ethanol, and ammonia can be chosen for different waste heat temperatures. The system harnesses temperature differentials near the boiling point of the working fluid to effectively recycle low-temperature waste heat, and can be applied to generate power from waste heat recovered in factories, co-generation systems, and boilers. Read more from Japan for Sustainability
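As a rough sanity check (my own back-of-envelope arithmetic, not Da Vinci's figures), the Carnot limit bounds how much of that 80 to 200 degree Celsius heat any engine could possibly convert:

```python
def carnot_limit(t_hot_c, t_cold_c=25.0):
    """Ideal (Carnot) efficiency ceiling for a heat engine operating
    between a hot source and a cold sink, temperatures in Celsius."""
    t_hot = t_hot_c + 273.15   # convert to kelvin
    t_cold = t_cold_c + 273.15
    return 1.0 - t_cold / t_hot

# The waste-heat range the article cites for the RHE: 80-200 degrees C.
for t in (80, 140, 200):
    print(f"{t:3d} C source: Carnot ceiling ~ {carnot_limit(t):.0%}")
```

A real engine captures only a fraction of this ceiling, which is why low-grade heat recovery was long dismissed as uneconomical.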
Those interested in EIA’s views on shale gas, which differ in significant respects from those outlined in the June 27 article, may want to review the EIA response to the inquiry from the Times, the Issues in Focus discussion of shale gas included in the Annual Energy Outlook 2011, and a recent presentation on domestic and international shale gas.
There are approximately 160 million set-top boxes installed in US homes, or the equivalent of one box for every two Americans.
These boxes consume as much electricity each year as that consumed by the entire state of Maryland...
The NRDC study, Reducing the National Energy Consumption of Set-Top Boxes, also found that today's average new cable high-definition digital video recorder (HD-DVR) consumes more electricity annually than the new flat panel TV to which it's typically connected and about 40% more than its basic set-top box counterpart. In contrast, cell phones, which also work on a subscriber basis with a need for secure connections, are able to use extremely low levels of power when not in use – primarily to preserve battery life.

Full Report Here
HackDay - While the Isle of Man typically plays host to an array of gas-powered superbikes screaming through villages and mountain passes at unbelievable speeds, the island's TT Race is a bit different. Introduced in 2009 to offer a greener alternative to the traditional motorcycle race, organizers opened up the course to electric bikes of all kinds... Their entry into the race is the brainchild of PhD student [Lennon Rodgers] and his team of undergrads. They first designed a rough model of the motorcycle they wanted to build in CAD, and through a professor at MIT sourced some custom-made batteries for their bike.
While the team didn't take the checkered flag, they did finish the race in 4th place. Their bike managed to complete the course with an average speed of 79 mph, which isn't bad according to [Rodgers]. He says that for their first time out, he's happy that they finished at all, which is not something every team can claim. Read full at HackDay
GizMag - Lately we're hearing a lot about the green energy potential of fuel cells, particularly hydrogen fuel cells. Unfortunately, although various methods of hydrogen production are being developed, it still isn't as inexpensive or easily obtainable as fossil fuels such as coal. Scientists from the Georgia Institute of Technology, however, have recently taken a step towards combining the eco-friendliness of fuel cell technology with the practicality of fossil fuels - they've created a fuel cell that runs on coal gas.
For some time now, it has been possible to operate solid oxide fuel cells using hydrocarbons. Those cells typically conk out in as little as half an hour, however, because carbon deposits have formed on their anodes in a clogging process known as "coking."
The Georgia Tech researchers have devised a vapor-deposition technique for growing nanostructures from barium oxide nanoparticles on those anodes. By absorbing moisture, the structures start a water-based chemical reaction that oxidizes carbon deposits as they form. Using this approach, the team has been able to run coal gas-powered fuel cells for up to 100 hours with no signs of carbon deposits.
Read more at GizMag; the research was recently published in the journal Nature Communications.
“A lot of people are going to have things break and they’re not going to know why,” said Demetrios Matsakis, head of the time service department at the U.S. Naval Observatory, one of two official timekeeping agencies in the federal government.
Officials say they want to try this to make the power supply more reliable, save money and reduce what may be needless efforts. The test is tentatively set to start in mid-July, but that could change. - Associated Press
At first glance, the story looks credible. And scary. The information comes from a physician, Janette Sherman MD, and epidemiologist Joseph Mangano, who got their data from the Centers for Disease Control and Prevention's Morbidity and Mortality Weekly Reports—a newsletter that frequently helps public health officials spot trends in death and illness.
Look closer, though, and the credibility vanishes. For one thing, this isn't a formal scientific study and Sherman and Mangano didn't publish their findings in a peer-reviewed journal, or even on a science blog. Instead, all of this comes from an essay the two wrote for Counter Punch, a political newsletter. And when Sci Am's Moyer holds that essay up to the standards of scientific research, its scary conclusions fall apart.
...why did the authors choose to use only the four weeks preceding the Fukushima disaster? Here is where we begin to pick up a whiff of data fixing. ... While it certainly is true that there were fewer deaths in the four weeks leading up to Fukushima than there have been in the 10 weeks following, the entire year has seen no overall trend. When I plotted a best-fit line to the data, Excel calculated a very slight decrease in the infant mortality rate. Only by explicitly excluding data from January and February were Sherman and Mangano able to froth up their specious statistical scaremongering.
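Moyer's check is easy to reproduce in spirit. The sketch below uses made-up weekly counts (not the actual CDC MMWR data) to show how a whole-series trend line can be flat or falling even while a hand-picked low baseline window makes the later weeks look elevated:

```python
import numpy as np

# Hypothetical weekly infant-death counts -- illustrative only,
# NOT the MMWR numbers the Counter Punch essay used.
rng = np.random.default_rng(42)
weeks = np.arange(20)
deaths = 190 - 0.5 * weeks + rng.normal(0, 3, size=weeks.size)

# Honest approach: least-squares trend fitted over the whole series.
slope, intercept = np.polyfit(weeks, deaths, 1)

# "Data fixing": pick the four LOWEST weeks as the baseline, then
# compare any later mean against it -- an apparent rise is guaranteed.
cherry_baseline = np.sort(deaths)[:4].mean()
overall_mean = deaths.mean()

print(f"full-series slope: {slope:+.2f} deaths/week")
print(f"cherry-picked baseline {cherry_baseline:.1f} "
      f"vs overall mean {overall_mean:.1f}")
```

By construction the cherry-picked baseline sits below the overall mean, so any comparison against it exaggerates an "increase" regardless of the real trend.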
You can see that data all plotted out nicely by Moyer over at Sci Am. How did these numbers get so heinously distorted? It's hard to say. But there should be some important lessons here. In particular, this is a good reminder that human beings do not always behave the way some economists think we do. We're not totally rational creatures. And profit motive is not the only factor driving our choices.
A new strain of scarlet fever has emerged that's about twice as resistant to antibiotics as previous antibiotic-resistant strains. This is heart-wrenching. My thoughts are with the families in Hong Kong suffering through this outbreak.
Date: Tuesday, June 28, 2011
Time: 1:00 PM - 2:00 PM CDT
Reserve your Webinar seat now at:
https://www1.gotomeeting.com/register/762297377
Please register today and pass this invitation on to those who could benefit.
Also, see the booklet Easy Material and Energy Savings, with case studies on these areas plus water conservation:
http://nbdc.unomaha.edu/energy/energy_book.pdf
Additional statistics and an interactive tool to investigate the economic and environmental impacts of shutting nuclear power plants in the US can be downloaded here.
Net farm income is expected in 2011 to reach its highest levels in more than three decades, as a rapidly growing and food-short world increasingly looks to the United States to provide it everything from soybeans and wheat to beef and fruit. Somebody should explain that good news to the Department of Agriculture: This year it will give a record $20 billion in various crop "supports" to the nation's wealthiest farmers -- with the richest 10 percent receiving over 70 percent of all the redistributive payouts. If farmers on their own are making handsome profits, why, with a $1.6 trillion annual federal deficit, is the Department of Agriculture borrowing unprecedented amounts to subsidize them?
At least $5 billion will be in direct cash payouts. Yet no one in the USDA can explain why cotton and soybeans are subsidized, but not lettuce or carrots. In fact, 70 percent of all subsidies go to corn, wheat, cotton, rice and soybean farmers. Most other farmers receive no federal cash. Yet somehow peach, melon and almond growers seem to be doing fine without government checks in the mail.
Then there is the more than $5 billion in ethanol subsidies that goes to the nation's corn farmers to divert their acreage to produce transportation fuel. That program has somehow managed to cost the nation billions... read more from source
“Fukushima has now been classified as the biggest industrial catastrophe in the history of mankind,” as the amount of radioactive fuel at Fukushima dwarfs that at Chernobyl.
Arnie Gundersen has said that Fukushima is the worst industrial accident in history, and has 20 times more radiation than Chernobyl.
Well-known physicist Michio Kaku just confirmed all of the above in a CNN interview:
In the last two weeks, everything we knew about that accident has been turned upside down. We were told three partial melt downs, don’t worry about it. Now we know it was 100 percent core melt in all three reactors. Radiation minimal that was released. Now we know it was comparable to radiation at Chernobyl.
The scientific journal Nature notes:

The computer simulation by researchers at Kyushu University and the University of Tokyo, among other institutions, calculated dispersal of radioactive dust from the Fukushima plant beginning at 9 p.m. on March 14, when radiation levels around the plant spiked.
The team found that radioactive dust was likely caught by the jet stream and carried across the Pacific Ocean, its concentration dropping as it spread. According to the computer model, radioactive materials at a concentration just one-one hundred millionth of that found around the Fukushima plant hit the west coast of North America three days later, and reached the skies over much of Europe about a week later.
Shortly after a massive tsunami struck the Fukushima Daiichi nuclear power plant on 11 March, an unmanned monitoring station on the outskirts of Takasaki, Japan, logged a rise in radiation levels. Within 72 hours, scientists had analysed samples taken from the air and transmitted their analysis to Vienna, Austria — the headquarters of the Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO), an international body set up to monitor nuclear weapons tests.
It was just the start of a flood of data collected about the accident by the CTBTO's global network of 63 radiation monitoring stations. In the following weeks, the data were shared with governments around the world, but not with academics or the public.
This week, officials have gathered at the National Press Club in Washington DC to ask themselves a simple question: What if it happens again?
"A similar storm today might knock us for a loop," says Lika Guhathakurta, a solar physicist at NASA headquarters. "Modern society depends on high-tech systems such as smart power grids, GPS, and satellite communications--all of which are vulnerable to solar storms."
She and more than a hundred others are attending the fifth annual Space Weather Enterprise Forum—"SWEF" for short. The purpose of SWEF is to raise awareness of space weather and its effects on society, especially among policy makers and emergency responders. Attendees come from the US Congress, FEMA, power companies, the United Nations, NASA, NOAA and more.
As 2011 unfolds, the sun is once again on the eve of a below-average solar cycle—at least that’s what forecasters are saying. The "Carrington event" of 1859 (named after astronomer Richard Carrington, who witnessed the instigating flare) reminds us that strong storms can occur even when the underlying cycle is nominally weak.
In 1859 the worst-case scenario was a day or two without telegraph messages and a lot of puzzled sky watchers on tropical islands.
In 2011 the situation would be more serious....
An avalanche of blackouts carried across continents by long-distance power lines could last for weeks to months as engineers struggle to repair damaged transformers. Planes and ships couldn’t trust GPS units for navigation. Banking and financial networks might go offline, disrupting commerce in a way unique to the Information Age. According to a 2008 report from the National Academy of Sciences, a century-class solar storm could have the economic impact of 20 hurricane Katrinas.
As policy makers meet to learn about this menace, NASA researchers a few miles away are actually doing something about it:
"We can now track the progress of solar storms in 3 dimensions as the storms bear down on Earth," says Michael Hesse, chief of the GSFC Space Weather Lab and a speaker at the forum. "This sets the stage for actionable space weather alerts that could preserve power grids and other high-tech assets during extreme periods of solar activity."
... This kind of "interplanetary forecast" is unprecedented in the short history of space weather forecasting.
"This is a really exciting time to work as a space weather forecaster," says Antti Pulkkinen, a researcher at the Space Weather Lab. "The emergence of serious physics-based space weather models is putting us in a position to predict if something major will happen."
Read more at NASA and more information may be found at the SWEF 2011 home page.
“Smoking certainly is a major cardiovascular risk factor and sitting can be equivalent in many cases,” explained Dr. David Coven.
Dr. Coven is a cardiologist. He says several new studies show prolonged sitting is now being linked to increased risk of heart disease, obesity, diabetes, cancer, and even early death.
“The fact of being sedentary causes factors to happen in the body that are very detrimental,” said Dr. Coven.
Dr. Coven says when you sit for long periods of time, your body goes into storage mode. When that happens, it stops working as effectively as it should. What’s worse, the more hours a day you sit, the greater your likelihood of developing one or more of these diseases, just as with smoking.
Linda Caufield has a desk job. She sits nearly seven hours a day.
“I’m on the computer, I’m on the phone, I’m doing paperwork so all that stuff has to be done at my desk,” she said.
So get up at the office every chance you get: don’t send emails when you can deliver the message in person, take the stairs, and stand up when you take a phone call. And don’t forget to take your breaks and take a walk.
To view research from American College of Cardiology click here.
A wee particle accelerator in the English countryside could be a harbinger of a safer, cleaner future of energy. Specifically, nuclear energy, but not the type that has wrought havoc in Japan and controversy throughout Europe and the U.S. It would be based on thorium, a radioactive element that is much more abundant, and much safer, than traditional sources of nuclear power.
Some advocates believe small nuclear reactors powered by thorium could wean the world off coal and natural gas, and do it more safely than traditional nuclear. Thorium is not only abundant, but more efficient than uranium or coal - one ton of the silver metal can produce as much energy as 200 tons of uranium, or 3.5 million tons of coal, as the Mail on Sunday calculates it.
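Taking the Mail on Sunday's ratios at face value, the arithmetic is easy to sketch; note the coal-plant consumption figure below is my own rough rule of thumb, not from the article:

```python
# Ratios quoted in the article: 1 ton of thorium ~ 200 tons of
# uranium ~ 3.5 million tons of coal.
THORIUM_TO_URANIUM = 200
THORIUM_TO_COAL = 3_500_000

def thorium_equivalent_tons(coal_tons):
    """Tons of thorium matching a given tonnage of coal, per the quoted ratio."""
    return coal_tons / THORIUM_TO_COAL

# Assumed figure: a 1 GW coal plant burns very roughly 3 million
# tons of coal per year (rule of thumb, not from the article).
coal_per_gw_year = 3_000_000
print(f"thorium per GW-year: ~{thorium_equivalent_tons(coal_per_gw_year):.2f} tons")
```

On those numbers, under a ton of thorium would match a large coal plant's annual fuel, which is the scale argument advocates lean on.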
The newspaper took a tour of a small particle accelerator that could be used to power future thorium reactors. Nicknamed EMMA - the Electron Model of Many Applications - the accelerator would be used to jump-start fissile nuclear reactions inside a small-scale thorium power plant.

Thorium reactors would not melt down, in part because they require an external input to produce fission. Thorium atoms would release energy when bombarded by high-energy neutrons, such as the type supplied in a particle accelerator.
Providing that stimulus is one obstacle to building small thorium reactors - but a new generation of accelerators like EMMA, and someday potentially even smaller, luggage-sized ones - could do the job.
EMMA is the first non-scaling, fixed-field, alternating-gradient (NS-FFAG) accelerator, qualities that make it easier to operate and maintain, more reliable and compact, more flexible and more efficient, according to British researchers. Other particle accelerators use alternating electric fields, which require special safety measures to guard against microwave exposure, for instance. EMMA's alternating magnetic field gradients are a more efficient and cheaper way to accelerate particles to higher energies. (Brookhaven National Laboratory explains in more detail here.)
EMMA operates at around 20 MeV, or 20 million electronvolts, a paltry amount for a particle accelerator. The Tevatron, for instance, accelerates particles to 1 tera-electronvolt (TeV). The Large Hadron Collider is designed to speed them to 7 TeV. But thorium reactors would not need such high energies to initiate fission.
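To put those energy figures on one scale (a quick unit sketch using the numbers quoted above):

```python
# Electronvolt prefixes.
MeV = 1e6    # million electronvolts
TeV = 1e12   # trillion electronvolts

emma = 20 * MeV        # EMMA's operating energy
tevatron = 1 * TeV     # Tevatron beam energy
lhc_design = 7 * TeV   # LHC design energy

print(f"Tevatron is {tevatron / emma:,.0f}x EMMA")      # 50,000x
print(f"LHC design is {lhc_design / emma:,.0f}x EMMA")  # 350,000x
```

Five orders of magnitude below the big colliders is plenty, since driving fission in a thorium blanket only needs neutron-producing energies, not particle-physics ones.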
British scientists are already working on a successor called PAMELA, the Particle Accelerator for Medical Applications, which will be used to treat cancer.
Click through to the Mail for a full tour of EMMA, its sister apparatus ALICE (Accelerators and Lasers In Combined Experiments), and a description of British efforts to produce thorium power.
My favorite comment from post:
Every couple of months we have a new technology that is going to revolutionise energy generation and save the planet (fusion reactors, genetically engineered algae - producing ethanol, high flying wind turbines, now this - thorium reactors, probably a load of other ideas ...). We also have some of the best wind and tidal resources in the world. Why on earth aren't we blazing away and revelling in cutting edge technology and embracing the potential that we have? Ah yes, it's because our glorious leaders are too incompetent and incapable of inspiring us. Plus their close friends in big business would rather we carried on paying extortionate prices, through a policy of scare mongering and the maintenance of cartels. To summarise; the rich carry on getting ever richer at the expense of the rest of us. Never mind, China or the USA will develop the technology and then sell it back to us at the usual insane price that we'll be happy to pay.- The von Horn, Shambury, Oxfordshire
Excerpt: An analysis of the terms of trade index for the United States during the past 20 years indicates that the index has tended to vary by less than 10 percent.
A spike in the import price index in 2008 (driven by sharply higher prices for fuels) was offset to some extent by a jump in the export price index (related to higher grain prices). (See chart 1.)
Given the recent surge in market prices for petroleum as well as grains, the terms of trade for the United States will bear watching.
Also see WSJ - "The Economy Is Worse Than You Think"

“There is a storm developing in agriculture,” said Jean Bourlot, global head of commodities at UBS AG in London. “If we have the slightest disruption in any part of the world, the effect on the price will be considerable.”
Please read more at Bloomberg
This voluntary standard, developed by a project committee of 45 partnering countries from the International Organization for Standardization (ISO), provides organizations with a framework for continuous energy performance improvements. The framework will encourage adoption of best practices that reduce the energy use of existing equipment and facilities, require the use of energy performance data to target cost-effective upgrades, and emphasize the design and installation of highly efficient energy systems and equipment. By increasing their operational efficiency, organizations that adopt the ISO 50001 standard will save money by saving energy.
Please see full story at EERE
How Much Gas Is Left?
Natural gas – how much is left, who’s got it and how supplies oddly mirror certain geopolitical tensions. Another little interactive visual of ours.
The water crisis in Texas, the biggest oil- and gas-producing state in the U.S., highlights a continuing debate in North America and Europe over fracking's impact on water supplies. Environmentalists say the method poses a contamination threat, while farmers face growing competition for scarce water.
Fracking is a 60-year-old method of shattering rocks to unleash oil and natural gas with high-pressure jets of sand- and chemical-infused water. In the past decade, the technique has been refined and coupled with new ways of drilling sideways through oil-rich shale formations, spurring an onshore exploration boom, says Robert Ineson, senior director of global gas at researcher IHS CERA.
"It's a heavy industrial activity with massive amounts of water and chemicals."
The Eagle Ford's peculiar geology means each well fracked requires an amount of water equivalent to that used by 240 adults in an entire year for cooking, washing, and drinking. A study by the Texas Water Development Board and the University of Texas at Austin's Bureau of Economic Geology estimates fracking-water demand in the area will jump tenfold by 2020 and double again by 2030. "This is not the drilling your grandparents knew in West Texas," says Sharon Wilson, an organizer for Earthworks' Oil & Gas Accountability Project, which lobbies for tougher regulation of oil drillers.
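For scale, the 240-adults comparison can be turned into gallons with an assumed per-person figure (my assumption, not from the Texas study):

```python
# Assumption: ~80 gallons/day of domestic water per adult for cooking,
# washing, and drinking (a rough US figure, not from the article).
GALLONS_PER_ADULT_PER_DAY = 80
ADULT_YEARS_PER_WELL = 240   # the article's per-well equivalence

well_gallons = ADULT_YEARS_PER_WELL * GALLONS_PER_ADULT_PER_DAY * 365
print(f"~{well_gallons / 1e6:.1f} million gallons per fracked well")

# Projected demand growth quoted above: 10x by 2020, doubling again by 2030.
growth_by_2030 = 10 * 2
print(f"projected demand multiple by 2030: {growth_by_2030}x")
```

Several million gallons per well, multiplied twentyfold over two decades, is the arithmetic behind the competition with farmers for scarce water.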
For now, local water departments, farmers, and oil drillers near Laredo are relying on water from two reservoirs and underground aquifers filled by last summer's tropical storm season. But that won't last forever.