Public transport has a problem with money. Campaigners often argue that mass transit is a public good in its own right, and hence should be very cheap or even free.
Mainstream media and even many self-proclaimed supporters of public transport run emotional stories about fare increases, while governments offer “giveaway” fare policies that severely restrict transport revenue.
Trains and buses are widely derided as “lower-class” forms of transport, despite being one of society’s most important enablers for economic development and social engagement.
Together, these lines of thinking add up to a severe lack of funding for transport in Australian cities – both in terms of infrastructure investment and operating revenues.
Spending on public transport is routinely delayed, while urban roads are given priority. Meanwhile, the fuss over fares means that operators aren’t receiving the funds they need to deliver a good service.
Until we face up to these issues, the struggle to provide better public transport will be an uphill battle.
What do the world’s best mass transit networks have in common? They have fare structures that let them recover a large proportion (typically 75% or more) of their operating costs.
Designing a smart transport policy means accepting that there are limits to what governments can provide. So it’s best to use any available subsidy wisely. We should increasingly look to passengers to cover much of the cost of the transit services that benefit them directly.
We tend to think about fares in an illogical way. Australian media reports bemoaning expensive fares routinely cite the cost per trip, when they should quote the cost per kilometre.
For example, the much-maligned trip from the Gold Coast to Brisbane covers up to 80 km and costs A$14. Contrast that with the UK, where travelling the 71 km from London to Milton Keynes will cost you at least £18.50 (A$33.50).
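The per-kilometre comparison the article urges can be computed directly from the figures quoted above. This is purely illustrative arithmetic; the fares, distances and the A$33.50 conversion of the UK fare all come from the article itself.

```python
# Illustrative per-kilometre fare comparison, using only the figures quoted in the text.
gold_coast_fare_aud = 14.0      # Gold Coast to Brisbane
gold_coast_km = 80.0

milton_keynes_fare_aud = 33.50  # London to Milton Keynes (£18.50, converted in the article)
milton_keynes_km = 71.0

aus_per_km = gold_coast_fare_aud / gold_coast_km        # ≈ 0.175 A$/km
uk_per_km = milton_keynes_fare_aud / milton_keynes_km   # ≈ 0.472 A$/km

print(f"Australia: A${aus_per_km:.3f}/km")
print(f"UK:        A${uk_per_km:.3f}/km")
```

On these numbers the UK fare recovers more than two and a half times as much per kilometre travelled, which is the comparison the cost-per-trip framing obscures.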
I would suggest that Britain’s fares demonstrate a more realistic understanding of the limited resources available for funding rail travel.

Huge subsidies
In Australia, residents of outer suburban areas receive large subsidies for trips that would be rightly regarded as regional journeys overseas. Meanwhile, most passengers in Sydney, Brisbane or Melbourne who travel short hops of just a few kilometres get little or no subsidy.
This unbalanced approach distorts our ability to provide better services. There is little relationship between the price of tickets and the cost per kilometre of transporting those passengers.
At this point the argument becomes tangled. Self-proclaimed supporters of transit bang the table and shout that outer suburban travellers need to be given perks to stop them defecting to their cars.
But why should someone living in Broadmeadows, 16 km north of central Melbourne, receive less subsidy than someone from Frankston, 41 km southeast of the CBD? Distance from the CBD is not a proxy for disadvantage, nor for deservedness of subsidy.
Disadvantage should primarily be addressed through concession fare offerings and workable transit access, not by a crude system of frittering public resources on cheaper fares for long distances.
It would be much better to attempt to devise a fare structure based on equality of subsidy, which reflects the reality of how much it costs to provide these services.
These are complex questions. Successful transport networks overseas treat them as such, but our state politicians continue to play immature games with ticket prices.
Recent examples include the move towards “free” CBD travel in Melbourne, where it is far from clear that a majority of users want free travel anyway.
The Independent Pricing and Regulatory Tribunal has recommended that Sydney’s train network should be paid for by passengers – largely white-collar workers who want good service rather than cheaper fares – instead of by the wider public.
But still the policy discussion focuses exclusively on “affordability”, while state governments bemoan their inability to fund transport upgrades. This shows a pretty poor grasp of basic infrastructure economics.
If we took a better approach to transport funding, we could deliver new projects much faster and more affordably. Analysis shows that asking the beneficiaries to pay for new services (“value capture”, to use the policy jargon) can cover 25-50% of project costs.
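Applying that 25-50% value-capture range to the project cost quoted in the next paragraph gives a sense of the sums involved. This sketch uses only the article's two inputs (the percentage range and the A$9-11 billion cost); the dollar outputs follow mechanically.

```python
# Illustrative only: the article's 25-50% value-capture range applied
# to its quoted A$9-11 billion single-project cost.
project_cost_low, project_cost_high = 9e9, 11e9  # A$
capture_low, capture_high = 0.25, 0.50

min_captured = project_cost_low * capture_low     # A$2.25 billion
max_captured = project_cost_high * capture_high   # A$5.5 billion

print(f"Value capture could cover roughly A${min_captured/1e9:.2f}bn "
      f"to A${max_captured/1e9:.1f}bn of the project cost")
```

In other words, somewhere between a quarter and a half of the headline cost could plausibly come from beneficiaries rather than consolidated revenue.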
Rather than throw A$9-11 billion at a single rail project in Melbourne, why not use value capture to deliver an entire suite of generational rail upgrades?
This approach has worked in mega-projects such as London’s Crossrail. In Los Angeles' “30-10” initiative, what was previously seen as 30 years' worth of transport infrastructure will be delivered in around a decade.
Melbourne could do a “30-12” and deliver Metro One, the airport link, and the Rowville and Doncaster rail projects in a 10-12 year program, from that same A$10 billion consolidated revenue base - with the help of alternative funding approaches.
But this can only happen if we adopt the attitude to funding seen in many places overseas, and crucially, only if there is a genuine interest to see these rail programs realised.
This is where the real difficulty starts. As a specialist observer of transit projects and their funding, I believe that institutional roadblocks are our biggest problem in Australia.
If our leaders among government, the public service, and the transport industry didn’t actually want better mass transit, then fudging around on fare structures and capital funding options would be a really effective way to delay progress.
Conversely, if better transit is the biggest game in town, it’s time we got serious about it. To fix and expand our rail systems we need to get fares right and we need to diversify the funding mix - because there’s just no other way to get where we want to go.
Chris Hale operates his own consulting business and this article is his last as an academic at the University of Melbourne. He has recently contracted to Melbourne City Council on the issue of transit funding, and in years past also contracted to the Cross River Rail and Sydney Metro projects on the same topic. He currently receives no research funding on this topic, but between 2009 and 2011 was the recipient of grants for research into mass transit funding from the Australian CRC for Rail Innovation. Chris has been the recipient of substantial in-kind support (such as access to data and availability of senior staff time) over many years on this topic from MTR of Hong Kong.
Lead pollution from Australia reached Antarctica in 1889 – long before the frozen continent’s golden age of exploration – and has remained there ever since, new research shows.
In our study, published in Nature Scientific Reports, my colleagues and I used ice core samples from West and East Antarctica to reveal the continent’s long and persistent history of heavy metal pollution.
The Antarctic remains the most remote and pristine place on Earth. Yet despite its isolation, our findings show that it has not escaped contamination from traces of industrial lead, a serious pollutant and neurotoxin. The levels of lead pollution found in the ice cores are too low to affect Antarctic ecosystems, but higher levels would be expected closer to sources.

Isolated outpost
Antarctica’s isolation gives us a unique vantage point to investigate large-scale changes in the Earth system, and the influence of humanity.
The new study, led by Dr Joe McConnell of the Desert Research Institute in Nevada, used an array of Antarctic ice cores to reveal a detailed record of where and when pollution can be found.
The first trace of lead pollution arrived in Antarctica around 1889, 22 years before the Amundsen and Scott expeditions to the South Pole.
We also discovered that lead pollution in the Antarctic peaked twice, and that in both cases Australia was the primary source.
After an initial peak in the late 1920s, lead levels dropped in sync with the Great Depression and Second World War. The pollution peaked again in about 1975.
Today, although levels are lower than at the 1975 peak, they remain at roughly three times the pre-industrial level.

Straight from source
How do we know where this pollution came from? Lead ore deposits contain a unique combination of lead isotopes (atoms with the same number of protons, but different number of neutrons) that can be used like a fingerprint to determine the original sources and how different sources are mixed.
Lead from the Antarctic samples has the same fingerprint as lead from Broken Hill, New South Wales – an old mining town with significant lead deposits.
We used ice cores from many locations, spanning several thousand kilometres, where previously only a handful of records were available, with long periods missing from the historical record.
The cores had to be shipped to the United States to be analysed, although future studies can be carried out in Australia, at the new Trace Research Advanced Clean Environment (TRACE) facility, which will allow us to detect the presence of minuscule amounts of contamination.
More analysis will help us unlock more of Antarctica’s secrets. If you’ll excuse the pun, our latest results are just the tip of the iceberg with regard to information stored in the Antarctic ice sheet.
For example, fires in the Southern Hemisphere have left traces in the ice, along with a history of climate. The histories of persistent organic pollutants and mercury in the remote south are still poorly known. Colleagues at CSIRO and the Australian Nuclear Science Technology Organisation are using ice cores to understand the past variability of greenhouse gases and the Sun. Combined with records from tree rings, sediments and caves, ice cores help to build a large-scale reconstruction of past sea level pressure.
Meanwhile, Antarctica continues to serve as a sentinel for unintended consequences of human activities – in this case, the pollution of a pristine frozen wasteland by an Australian mining product.
Ross Edwards works for Curtin University. He receives funding from the Australian Research Council, Australian Antarctic Science grants and the Agilent Foundation.
Researchers at the Massachusetts Institute of Technology unveiled a new material this week that provides a highly efficient way to convert sunlight into steam and holds major potential for improving technologies like desalination of water and solar thermal power — all with a four-inch graphite ‘sponge.’
The setup developed by MIT consists of a layer of graphite flakes sitting atop a layer of carbon foam. The structure is porous, which enables the disc to float on water, and the dark color of the graphite absorbs the maximum energy from the sun.
The end result is a system that converts 85 percent of incoming solar energy into steam — far more efficient than previous methods. “Basically, if you heat up the whole volume of the water, you don’t raise the temperature very much,” Gang Chen, a professor of mechanical engineering at MIT, told ThinkProgress. “However, if you only heat up a small amount of water, then the temperature rise could be high.”
Chen explained that by floating the graphite on the surface of the water, the researchers were able to concentrate the maximum amount of incoming sunlight, while the foam beneath provided a further layer of insulation.
The ability to create steam quickly and efficiently, with inexpensive materials and using only sunlight, has tremendous implications. “Steam is important for desalination, hygiene systems, and sterilization,” said Hadi Ghasemi, a postdoc in MIT’s Department of Mechanical Engineering who led the development of the structure. “Especially in remote areas where the sun is the only source of energy, if you can generate steam with solar energy, it would be very useful.”
Chen said there are two potential applications he’s particularly excited about: developing more efficient solar thermal power plants and creating a cheaper and more accessible way to treat water. Concentrated solar plants like the massive Solana facility in Arizona use a parabolic trough system — a large structure that incorporates mirrors to focus the sun’s heat on pipes, heating a synthetic oil that flows to boilers, which create the steam that drives turbines to produce electricity, much like a traditional power plant.
The plants require high intensity sunlight and the mirrors or lenses track the sun from east to west throughout the day. The system is costly and often results in significant heat loss. By contrast, MIT says its approach “generates steam at a solar intensity about 10 times that of a sunny day — the lowest optical concentration reported thus far.” Chen said his team will be working “to continue to improve [their setup] to reduce the tracking or eliminate the tracking.” If successful, the end result would be lower operating costs for the facility and the ability to power the plant with lower concentrations of sunlight.
“Clean energy is always competing against the fossil-based fuels,” he noted, making any cost reductions particularly important.
And it’s not just more efficient solar power generation that has the MIT team motivated to continue their research. “Think about the water treatment, desalination or treating wastewater,” Chen said. “One typical way is to evaporate the water, condense it, of course you need an energy source to do that. In this case, if we can use solar energy, it could produce better technology.”
For large-scale desalination plants, energy is the single biggest expense. Desalination plants on average use about 15,000 kilowatt-hours of power for every million gallons of fresh water produced, according to a 2013 report by the Pacific Institute — a reality that has serious implications from a carbon emissions perspective if production of desalinated seawater is increased.
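The Pacific Institute figure quoted above converts to more familiar per-volume units with standard conversion factors. Only the 15,000 kWh figure comes from the article; the gallon-to-litre factor is the standard US value.

```python
# Convert the quoted 15,000 kWh per million US gallons into kWh per cubic metre.
kwh_per_million_gallons = 15_000
litres_per_us_gallon = 3.785           # standard conversion factor (assumption, not from the article)

litres = 1_000_000 * litres_per_us_gallon  # ~3.785 million litres
cubic_metres = litres / 1000               # ~3,785 cubic metres

kwh_per_m3 = kwh_per_million_gallons / cubic_metres
print(f"{kwh_per_m3:.1f} kWh per cubic metre")  # roughly 4 kWh/m^3
```

That works out to roughly 4 kWh per cubic metre of fresh water, which is why the energy source behind desalination matters so much for its emissions footprint.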
Gripped by severe and prolonged drought, California is currently pouring millions of dollars into desalination plants and last year, the Marshall Islands, home to around 60,000 people, declared a state of national emergency and shipped in desalination plants to fight a severe drought.
Chen said the potential to commercialize their system to use distributed solar energy could be huge for water treatment in isolated, impoverished areas. In fact, he’s already receiving emails from people around the world excited by the prospect of small-scale technology to produce potable water.
Chen was careful to emphasize, however, that their recent breakthrough in the laboratory was just a first step. There’s a long way to go before their solar ‘sponge’ could be used in water treatment or power generation. Moving forward, the MIT team plans to pressurize the system to figure out the upper bounds of temperature and pressure it can withstand and to take the next steps with water treatment applications by assessing the potential challenges associated with that process.
The post The Crazy New ‘Sponge’ That Can Generate Steam From Sunlight appeared first on ThinkProgress.
After seven years of weird weather thought to be linked to climate change, an unusual phenomenon is unfolding across Britain: a lovely, sultry, old-fashioned summer.
The recent weeks of warm weather punctuated by sharp, thunderous storms are, according to the Met Office, pretty much what should be expected for this time of year and, across the land, nature is taking advantage of a return to order. Meadows are in full flower and abuzz with insects, fruit is abundant and ripening and birds are feasting on the bounty. In the fields, farmers are looking forward to a good harvest.
The Environmental Protection Agency isn’t doing enough to prevent methane from escaping from natural gas pipelines, according to a new report from the agency’s internal watchdog.
The report, published Friday by the EPA’s Inspector General, stated that in 2011, more than $192 million worth of natural gas was lost due to leaks in pipelines. The report said that the agency, which until now has “placed little focus and attention on reducing methane emissions from pipelines in the natural gas distribution sector,” needs to take steps to better prevent methane from escaping. It recommended that the EPA work with the Pipelines and Hazardous Materials Safety Administration (PHMSA) to try to fix the problem, a partnership President Barack Obama has also called for.
Up until now, however, the EPA has only implemented a program that encourages natural gas companies to reduce their methane emissions voluntarily, but doesn’t require them to do so. So far, that program hasn’t done enough, the report states.
Methane is a potent greenhouse gas that traps 86 times more heat than CO2 does over a 20-year period. Scientists have warned that methane emissions from the natural gas industry are a significant contributor to climate change, and in 2013, President Obama’s Climate Action Plan stated that “curbing emissions of methane is critical to our overall effort to address global climate change.”
The EPA has agreed to take the Inspector General’s recommendations to partner with PHMSA and create a plan to deal with the financial losses of methane leaks, but it has not yet agreed to other recommendations in the report, including setting performance goals for leak reduction and tracking methane emissions from natural gas pipelines.
The report states that methane leaks typically occur in older pipelines made of cast iron or unprotected steel, which are more prone to cracking and corrosion. Earlier this week, a report from the BlueGreen Alliance recommended that the U.S. replace pipelines every 10 years, rather than every 30, a sped-up timeline that would cut pollution and risk of spills as well as create jobs and increase U.S. GDP.
The Inspector General and BlueGreen Alliance’s reports are the latest of many that warn of major methane emissions from the natural gas sector. Earlier this month, a study found that 40 percent of oil and gas wells in the Marcellus shale region are predicted to fail, causing them to leak methane into the atmosphere and water. Another study from the University of Colorado Boulder in May found methane leaks from oil and gas development in Colorado were three times greater than they had been predicted to be by emissions inventory estimates.
This also isn’t the first time the EPA has been targeted for doing too little to measure or reduce methane emissions. In May, two Cornell Researchers said the EPA is drastically underestimating the potency of methane, and that not enough is being done to reduce methane emissions in the U.S. The White House issued a strategy for methane on March 28, and is expected to decide by later this year whether or not new EPA regulations on methane emissions are necessary.
The post EPA Is Failing To Stop Methane Leaks From Pipelines, Inspector General Says appeared first on ThinkProgress.
Here’s a 2-minute video featuring a parable by Wangari Maathai. Watch, then be a hummingbird by joining us in New York this September.
Courtesy of Dirt! The Movie
Gushing about the benefits of tar sands isn’t easy, or so I realized as I struggled to come up with credible reasons for why we should extract the toxic sludge while poisoning indigenous communities to burn a carbon-intensive fuel source. Even in character as a politician corrupted by the dirty money of the oil lobby—personified by a Tar Sands Monster—it didn’t make sense to me.
But the lunacy of exploiting tar sands was exactly what this street theater intended to display. As part of my fellowship with 350.org, I was tasked with planning a rally at the Conference of New England Governors and Eastern Canadian Premiers. In addition to some behind-the-scenes legwork, I volunteered to put together some fun, visually striking street theater for the event.
New England faces an invasion of Canadian tar sands. Though South Portland recently stopped a tar sands pipeline dead in its tracks, Big Oil continues to search desperately for ways to transport its disaster-prone product. The Tar Sands Free Northeast coalition staged the rally, billed as a People’s Conference, last Sunday to highlight just how far removed the wishes of the oil industry are from the popular reality. Our politicians are beholden to us, the People, and we sent them a message loud and clear: no tar sands in New England—leave it in the ground.
Speakers from the People’s Conference recognized the destruction that tar sands extraction wreaks on the land of indigenous people and once-pristine boreal forests. Tar sands cannot be transported safely by pipeline or rail, and the inevitable spill threatens water supplies, wildlife, and local economies alike. The speakers offered myriad reasons why the tar sands must be stopped. Then came the street theater.
Brushing aside such protestations from the crowd before me, I drank from the oily Kool-Aid that the Tar Sands Monster offered me (with monetary compensation, of course) and sang the praises—however unconvincing—of viscous carbon catastrophe. The longer I spoke, the tighter the monster’s grip over me became, literally, as it tied its tentacles around me. Doing what they do best, the crowd before me organized, taking it upon themselves to cut me from the clutches of the polluting lobby. As I came to my senses and saw the will of the people, the monster withered away. Thus redeemed, I agreed to work with my constituents to ban tar sands and plan infrastructure for a clean energy future.
Our politicians in real life likewise have a chance to listen to the people over the industry and end the tar sands invasion. We’ll make sure they do.
The post Fossil Free Fellow Reflection: Taking Action against Tar Sands in New England appeared first on Fossil Free.
Here’s a message from youth around the world after the failure of the Copenhagen climate negotiations.
We’re not done yet, and that’s why we’re coming to New York this September.
Join us: http://peoplesclimate.org/global/.
Courtesy of Australian Youth Climate Coalition
CREDIT: AP Photos
There are two major-party candidates in the running for West Virginia’s 2nd Congressional District seat in the U.S. House, and both of them think the issue of climate change is best left for other countries to deal with, according to multiple news reports from a candidate forum Thursday.
At the forum, both Democrat Nick Casey and Republican Alex Mooney would not say whether they accept that humans are contributing to global warming. But either way, both candidates also said that it wasn’t for them to decide.
“It’s not our problem,” Casey reportedly said, adding that it was an international issue. “These other people think we’ve got a global problem, let’s see them step up.”
According to the Associated Press, Mooney echoed Casey, saying “there’s no EPA in China” to ensure the country is limiting its greenhouse gas emissions. On the contrary, China has a Ministry of Environmental Protection that works to limit emissions.
Both candidates used their stance on the United States’ responsibility to fight climate change to argue that the Environmental Protection Agency should not be regulating greenhouse gas emissions from coal-fired power plants. If elected, Mooney said he would “fight for legislation to defund and restrict the EPA.”
Casey is certainly correct that, at least in the immediate future, climate change is not solely America’s problem. Indeed, studies show that the countries that will be hit the hardest by climate change are the poorest — East Africa, Burma, Bangladesh and India stand to be impacted greatly by severe and unpredictable weather.
At the same time, it’s primarily America’s emissions that have caused the problems in those countries. The United States is currently the world’s second-biggest carbon emitter, but has altogether emitted more greenhouse gases than any other country historically.
And while it is true that countries such as China need to also put in substantial effort for global warming policies to have real impact, America’s position as an economic leader that has contributed the most to carbon emissions necessitates that we make the first move — which is exactly what the EPA’s greenhouse gas regulations on coal plants represent. As President Obama told the New Yorker a few months ago, “It’s very hard for me to get in that conversation [with China on reducing greenhouse gases] if we’re making no effort.”
Climate change is already impacting the United States, too, according to the recently released National Climate Assessment. If emissions aren’t drastically reduced, that report predicts increased drought and wildfires in the West, heavy precipitation and flooding in the East, and more severe weather in the South, among other things.
Mooney and Casey, however, also openly questioned whether climate change was primarily caused by humans, using the increasingly popular excuse that they’re not qualified to know. Mooney said that he didn’t believe that scientists had yet come to a consensus around the issue, but said the debate belongs “in the climate change community.”
Casey reportedly went a bit further on the existence of climate change, saying “something is going on,” but would not go as far as calling it a problem.
“Is it long term or not?” he said, according to the Charleston Daily Mail. “I’ll leave that up to the scientists.”
The tactic of leaving climate change “up to the scientists” has become a way for politicians to avoid taking a stance on global warming. But actual climate scientists have decried this tactic, saying multiple reports have been written by scientists and other experts specifically so that politicians could understand climate change and how it affects the country.
“Personally, I don’t think it proper for any American to use that argument,” Donald J. Wuebbles, a coordinating lead author for the U.N. Intergovernmental Panel on Climate Change’s 2013 assessment report, told ThinkProgress in May, adding that climate change should be “readily understood by any policymaker.”
As it stands now, peer-reviewed research shows a 97 percent consensus among scientists that global warming is real and primarily driven by humans.
The post In West Virginia Congressional Race, Both Candidates Think Climate Change Is ‘Not Our Problem’ appeared first on ThinkProgress.
The White House announced today that federal agencies have cut their greenhouse gas emissions 17 percent since 2008 — roughly equivalent to permanently taking 1.8 million cars off the road.
The occasion for the announcement was the annual release of agency scorecards, documenting their progress in cutting GHG emissions, improving energy efficiency, and reducing pollution and waste. This is the fourth year the agencies have released the scorecards. The process was kicked off in 2009 when President Obama issued Executive Order 13514, which laid out the goals for the agencies as well as the process by which they should plan, measure, and document their progress.
The current 2020 goal as laid down by the White House is for the entire government to reduce collective GHG emissions from fuels and building energy use 28 percent from their 2008 levels. Agencies are also tasked with cutting indirect emissions — which come from things like government employee commutes and business travels — by 13 percent.
Federal agencies are also tasked with getting 20 percent of their energy from renewable sources by 2020, a goal that would triple government use of renewables from its 2013 level. Today’s announcement showed the agencies reached nine percent renewable use by the end of fiscal year 2013. The target for that deadline had been 7.5 percent, putting the agencies ahead of schedule.
The federal government’s use of potable water has also been cut 19 percent from 2007 levels, with a goal of a 26 percent improvement in efficiency of use by 2020.
The White House’s current pledge to the international community is to cut America’s overall carbon emissions 17 percent below their 2005 levels by 2020. The Environmental Protection Agency’s recent rules cutting emissions from new and existing power plants are a major part of that effort, as are a slew of other initiatives outlined in President Obama’s climate action plan, including the improvements for federal agencies.
And while the government’s fuel and electricity usage accounts for only 1.5 percent of the nation’s annual energy consumption, that still makes the federal government the largest single energy consumer in the U.S. economy — thanks to its operation of nearly 500,000 buildings and over 600,000 vehicles. So the Obama Administration thinks the symbolism is important: “The President firmly believes that the Federal Government should lead by example in improving energy efficiency and cutting harmful carbon pollution,” the White House statement said.
The energy efficiency goals are also significant. The growth of America’s national energy consumption has been slowing since the 1970s while economic growth has continued apace, suggesting the economy is getting better at doing more of value with less energy. Global trends among advanced nations are similar. And while more needs to be done, building codes in the U.S. have also improved dramatically in the last few years, putting the government at the forefront of the national trend.
As for water use, it doesn’t take much climate change to drive water scarcity up significantly — though the threat is more severe in other parts of the globe than in North America. Still, U.S. water supplies are already strained by drought and the lack of fresh water is also a threat to the future stability of the country’s energy supply.
As the White House noted, the third National Climate Assessment reported that climate change driven by human GHG emissions is already affecting the country, with more impacts to come in terms of drought, floods, storms, and damaged and strained infrastructure.
Other goals the White House has laid out for federal agencies include: a 30 percent cut in vehicle petroleum use by 2020, 50 percent recycling and waste diversion by 2015, and a net-zero-energy building requirement for 2030.
The post How The U.S. Government Just Pulled Off The Equivalent Of Retiring 1.8 Million Cars appeared first on ThinkProgress.
Former Treasury Secretary Robert Rubin has a must-read piece in the Washington Post, “How ignoring climate change could sink the U.S. economy.” The centrist economic panjandrum’s main point: The notion that tackling climate change will harm the economy is the exact opposite of the truth.
In this regard he makes a similar point to one Climate Progress made last week — one that Sen. Robert F. Kennedy made so powerfully on the presidential campaign trail nearly half a century ago (see below): the GDP is a deeply flawed measure of the economy’s health.
Rubin is a member of the bipartisan committee that oversaw the recent analysis, “RISKY BUSINESS: The Economic Risks of Climate Change in the United States.” That committee included Republicans like former Secretary of the Treasury (and of State) George P. Shultz, former Sen. Olympia Snowe, and former Bush Treasury Secretary Henry Paulson. Paulson you may recall had an op-ed in the New York Times last month arguing that a carbon tax is needed to help avert “The Coming Climate Crash.”
Rubin’s point is that we need a new GDP “that incorporates the impact of greenhouse gas emissions.” Instead of simply tallying up “the goods and services produced by our economy” we need a GDP that can “account for the present and future damage resulting from the emissions involved in producing those goods and services.” His bottom line:
We do not face a choice between protecting our environment or protecting our economy. We face a choice between protecting our economy by protecting our environment — or allowing environmental havoc to create economic havoc. And a major step toward changing the debate is to change the way we measure the health of our economy, our fiscal conditions, and the health of individual companies and businesses to better reflect the world as it will be.
Several months before he was assassinated, Robert F. Kennedy also challenged our monomaniacal pursuit of GDP in “one of the most beautiful of his speeches,” as Obama described it in an August 2008 New York Times profile.
Here is the transcript:
We will find neither national purpose nor personal satisfaction in a mere continuation of economic progress, in an endless amassing of worldly goods. We cannot measure national spirit by the Dow Jones Average, nor national achievement by the Gross National Product. For the Gross National Product includes air pollution, and ambulances to clear our highways from carnage. It counts special locks for our doors and jails for the people who break them. The Gross National Product includes the destruction of the redwoods and the death of Lake Superior. It grows with the production of napalm and missiles and nuclear warheads…. It includes… the broadcasting of television programs which glorify violence to sell goods to our children.
And if the Gross National Product includes all this, there is much that it does not comprehend. It does not allow for the health of our families, the quality of their education, or the joy of their play. It is indifferent to the decency of our factories and the safety of our streets alike. It does not include the beauty of our poetry, or the strength of our marriages, the intelligence of our public debate or the integrity of our public officials… The Gross National Product measures neither our wit nor our courage, neither our wisdom nor our learning, neither our compassion nor our devotion to our country. It measures everything, in short, except that which makes life worthwhile, and it can tell us everything about America — except whether we are proud to be Americans.
And this was a campaign speech! RFK believed that this was a politically winning viewpoint.
The Bottom Line: Unsustainable pursuit of short-term “wealth” at the expense of sustainable prosperity — growth for the sake of growth — is both the cause of our recent economic collapse and the fatal misconception at the heart of our “Ponzi scheme” global economy.
The post Robert Rubin Echoes Robert F. Kennedy: GDP Is Fatally Flawed Measure Of Economic Health appeared first on ThinkProgress.
Phoenix set a record high temperature of 115°F at 1:32 p.m. on Thursday. Then, 43 minutes later, it set another as the temperature gauge at Sky Harbor International crept up again to 116.
Yuma, Arizona tied its record high of 117 for this date, and nearby Tacna hit 120.
Arizona hasn’t just been suffering high maximum temperatures — it’s been suffering high minimum temperatures too. Thursday brought a record high minimum temperature of 93, up from the previous record of 90 set back in 2006. “We have not dropped below the 90 degree mark since Tuesday morning, if you can believe that,” said Dr. Matt Pace of Phoenix’s NBC 12 News.
The Salt River Project, one of Arizona’s largest utilities, reported that on Wednesday it saw record demand from its Phoenix area customers, causing it to deliver the most power to them it ever has — 6,707 megawatts.
Phoenix gets hotter than some more rural areas because of the urban heat island effect, which causes the dense and smooth structures in a city (think cement) to absorb more heat than natural landscape does, and allow for more convection and less turbulence than rougher rural areas do. Some in the city are turning to cool roofs and vegetation to cut this effect down a little.
When hikers take to the trails without adequate hydration, they need to be rescued, something that is happening more and more. Last year there were 153 mountain rescues for stranded hikers in total. Just through the first six and a half months of this year, firefighters have reported 133 such rescues, and two deaths. So many of them are caused by extreme temperatures that the Phoenix Fire Department asked residents to just stay indoors between 8 a.m. and 10 p.m. through Thursday evening.
“More people die from heat than any other weather event,” Dr. Bob England, director of the Maricopa County Department of Public Health, told the Arizona Republic.
The Salvation Army set up nine different hydration stations with ice and water across the valley, and local shelters told CBS 5 that they anticipated a spike in demand — hundreds more people than normal.
High school football games begin in Arizona before September, which means that practices start next week for many teams — Week Zero. Several Phoenix-area teams, rather than contend with the constant hydration needs, burns, and other health problems brought by extreme heat, have simply left town, heading into the mountains or out of state for cooler temperatures.
It’s not just people that suffer in the extreme heat — zoo animals that are not used to the Arizona summers are kept alive and comfortable using some fairly unorthodox means. Sumatran tigers veg out on a frozen block of blood and fish. Elephants and other animals get sprayed with water. Lions get special cooling pads with cold water flowing through tubes underneath them. Last month, sheriff’s deputies found 20 dead dogs in a kennel that had lost its air conditioning.
Local plant nurseries sell shade cloth, which gardeners drape over plants that can’t take the extreme heat and direct sunlight.
A volunteer fire department in Cashion, Oklahoma fought a 10-hour blaze on Wednesday in a heat index of over 100 degrees, depleting its drinking water supply. Facing a weekend of 100-plus-degree temperatures, the department is asking for donations of water and Gatorade. Standards in California require that a quart of water be available for firefighters every hour — in hot, dry climates that need can rise to three quarts an hour, or three gallons a day.
Arizona, like much of the Southwest, can expect extreme weather and temperatures to increase as humans continue to release more heat-trapping gases into the atmosphere — but one of the main impacts, according to the recent National Climate Assessment, will be drought. And there is not much margin for error as the population grows: a new NASA study found that the Colorado River basin has lost enough groundwater to fill two Lake Meads.
The post Phoenix Sets Temperature Records As Arizona Gets Punished By Extreme Heat Wave appeared first on ThinkProgress.
Since 2004 the Colorado River basin — which provides water for seven states — has lost enough water to fill Lake Mead twice.
That’s the word from a new study by researchers at NASA and the University of California, who used satellite data to produce the first quantitative measurement of how much groundwater people in the American West and Southwest have used up in the current spate of droughts. According to the Wall Street Journal, the team determined that from 2004 to 2013 the basin lost 17 trillion gallons of water, which is enough to supply 50 million homes for a year. Three-fourths of that loss was groundwater, and the fastest rates of depletion occurred in 2013, following one of the driest years on record.
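A quick back-of-the-envelope conversion supports the “fill Lake Mead twice” framing. The sketch below uses the standard gallons-per-acre-foot conversion and an approximate full capacity for Lake Mead of about 29 million acre-feet (a round figure I'm assuming for illustration; it is not from the study itself):

```python
# Sanity-check the "two Lake Meads" comparison from the NASA/UC study.
GALLONS_PER_ACRE_FOOT = 325_851        # standard US conversion factor
LOSS_GALLONS = 17e12                   # basin-wide loss reported for 2004-2013
MEAD_CAPACITY_ACRE_FEET = 28.9e6       # approximate full capacity of Lake Mead

loss_acre_feet = LOSS_GALLONS / GALLONS_PER_ACRE_FOOT
mead_equivalents = loss_acre_feet / MEAD_CAPACITY_ACRE_FEET

print(f"Basin loss: {loss_acre_feet / 1e6:.1f} million acre-feet")
print(f"Lake Mead equivalents: {mead_equivalents:.1f}")
```

Seventeen trillion gallons works out to roughly 52 million acre-feet, or on the order of two full Lake Meads, consistent with the study's headline comparison.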
The Colorado River basin supplies the water for about 40 million people and four million acres of farmland across California, Arizona, Colorado, New Mexico, Nevada, Utah and Wyoming.
The study employed a pair of satellites launched by the GRACE mission in 2002, and translated their data on the fluctuations in the Earth’s gravitational pull into changes in total water storage. Separate data sets disaggregated factors like snowpack, soil moisture, surface water, and groundwater. One resulting drawback is that the data can’t get especially granular, nor can it tease apart whether water declines resulted from increased pumping or from lower recharge rates in the basin itself — though the former would go up and the latter would go down during a drought.
The system, described by Jay Famiglietti, one of the study co-authors, as “scales in the sky,” has already been used to measure groundwater depletion in California specifically as well as the Middle East. Just to be sure, the researchers also checked their conclusions against measurements taken from 74 individual wells throughout the surveyed area. The trends in the wells matched the trend in the satellite data.
“That gives us confidence in what GRACE is seeing,” said Stephanie Castle, a researcher at the University of California, Irvine, and the study’s lead author.
“We didn’t think it would be this bad,” Castle continued. “Basin-wide groundwater losses are not well documented. The number was shocking.”
Lake Powell and Lake Mead, both parts of the Colorado River basin, serve as some of the largest reservoirs for that area of the country, and officials are increasingly alarmed by water shortages in both lakes. Between the drought and the demand of rising populations, Lake Mead recently hit its lowest water level ever.
California especially has been punished by droughts in recent years, according to the U.S. Drought Monitor, with every last inch of the state covered by “moderate” to “exceptional” drought in April. More than 80 percent of the state is now in extreme drought, as defined by the U.S. Drought Monitor, and conditions are anticipated to stay that way at least through October. And work released last week estimated the economic damage from the drought this year at $2.2 billion in losses for the California agricultural industry, along with 17,000 jobs lost.
The New York Times just reported that about 34 percent of the lower 48 states has been in moderate drought (as defined by the Drought Monitor) or worse as of July 22. And while the Drought Monitor data only goes back to 2000, the Palmer Index goes back over a century, revealing that the current drought is on par with the epic droughts of the 1930s and 1950s.
Not surprisingly, residents in the most drought stricken states also tend to be the ones who consume the most water, which often means pulling from groundwater reservoirs. Per capita usage is highest in places like California, Arizona, Nevada and New Mexico, and much of the excess usage goes to watering plants, lawns, and landscapes. As much as half of that water is in turn wasted, as it evaporates or runs off thanks to inefficient irrigation methods.
California is also instituting mandatory water restrictions for the first time. The rules ban wasteful outdoor watering, including hosing down sidewalks and driveways, and require a shut-off nozzle for hoses. Maximum penalties could reach $500, enforceable by any public employee empowered to enforce laws, including employees of local water agencies. Work by the Natural Resources Defense Council has suggested that the right combination of water efficiency and conservation measures could close California’s current gap between its water usage and water supplies with room to spare.
Studies have also linked the droughts to climate change, as warmer temperatures alter weather systems to bring precipitation in wider circles around much of the American west. Climate change also brings precipitation in shorter, bigger bursts, and more heat in turn speeds up evaporation and prevents the accumulation of water resources.
The post The Colorado River Basin Has Lost Enough Water To Fill Lake Mead Twice Over appeared first on ThinkProgress.
“Smaller is baller,” “Min it to win it,” “Think shrink.”
Those are the puns Google is using to promote its new competition: $1 million to whoever can invent a working power inverter for solar and other forms of renewable energy that’s roughly as small as a laptop. The company has teamed up with the Institute of Electrical and Electronics Engineers to launch the competition.
For background, a power inverter is a box that converts direct current (DC) — the kind generated by solar panels and wind turbines — into the alternating current (AC) that runs things in our homes, offices, and businesses. Without an inverter, the DC power from a solar array can’t be used by most appliances or fed into the grid.
“Much of our domestic world relies on AC,” the tech blog Venturebeat explains. “And AC needs robust power distribution grids, huge power plants and either fossil fuel, nuclear power, or falling water.”
As they currently exist, inverters for solar and wind are, according to Google, too big — “roughly the size of a picnic cooler,” the company said on its new website for the “Little Box Challenge.” But if they can be shrunk down to the size of a tablet or laptop, the company says, it would “enable more solar-powered homes, more efficient distributed electrical grids, and could help bring electricity to the most remote parts of the planet.”
“There will be obstacles to overcome, like the conventional wisdom of engineering,” Google Green’s Eric Raymond said in a statement announcing the competition. “But whoever gets it done will help change the future of electricity.”
The challenge for engineers is to design and build a 1-kilowatt-minimum power inverter with a power density of at least 50 watts per cubic inch — not an insane amount of power, but enough to fuel some lights and a box fan. Applicants must register by September and submit their ideas by July 15 of next year. Eighteen finalists will be chosen, and a grand prize winner will be announced sometime in January 2016, the company said.
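The size target implied by that spec follows from simple arithmetic: a 1-kilowatt inverter at 50 watts per cubic inch can occupy at most 20 cubic inches. The sketch below works that out (the litre conversion is added here for scale; the picnic-cooler and laptop comparisons are Google's):

```python
# Derive the maximum enclosure volume from the Little Box Challenge spec.
MIN_POWER_W = 1000            # 1 kW minimum output required by the contest
MIN_DENSITY_W_PER_IN3 = 50    # minimum power density, watts per cubic inch

max_volume_in3 = MIN_POWER_W / MIN_DENSITY_W_PER_IN3   # largest allowed volume
max_volume_litres = max_volume_in3 * 0.0163871         # 1 cubic inch ≈ 0.0164 L

print(f"Maximum enclosure volume: {max_volume_in3:.0f} cubic inches "
      f"(~{max_volume_litres:.2f} L)")
```

Twenty cubic inches is about a third of a litre — comfortably smaller than a typical laptop, which is what makes the target so aggressive compared with today's cooler-sized units.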
Google has long promoted increased use of renewable energy, but its relationship with it has been a bit of a puzzle. The company’s Google Green subsidiary has invested millions in clean energy projects throughout the last decade, with an eventual goal to power its data centers with 100 percent renewable energy. The company claims that, with purchased offsets, its carbon footprint is zero, meaning it does not contribute to climate change. So far, Google has sunk more than $1 billion into wind and solar projects that in total generate more than 2 gigawatts of power.
At the same time, however, Google has hosted fundraisers to benefit one of the U.S. Senate’s most vehement climate deniers, and is a member of the American Legislative Exchange Council (ALEC) — a corporate lobbyist group that actively works to thwart statewide renewable energy programs.
The two-way relationship has raised questions about the company’s intentions among some environmentalists, who are asking Google to cut ties with the anti-climate group. But one possible explanation for the company’s partnership with interests purportedly contrary to its own is that the alliance is necessary to help sway them to its side, as Tim Worstall notes in Forbes.
“Sadly, the way that the modern economy works is that government, at all levels, has a great deal of influence over how business works,” he writes. “So, it is necessary for a large business to flash the cash around to both sides, to join lobby groups from all sides of the political compass.”
The post If You Can Make Solar Power Better, Google Will Give You $1 Million appeared first on ThinkProgress.
The governors of coal-dependent states like Texas, Louisiana, and Oklahoma have been among the most outspoken in their objections to the Environmental Protection Agency’s proposed carbon rules for existing power plants. Numerous polls have shown that these governors are seriously out of step with how their constituents feel, but new research now shows that they may also be out of touch with what’s best for the economy of their states.
A study conducted by the Rhodium Group and the Center For Strategic and International Studies, released Thursday, concludes that many of the states protesting the new rules the most also stand to benefit the most from the anticipated shift away from coal and toward natural gas.
According to Trevor Houser, an analyst at the Rhodium Group and a co-author of the study, that’s because many of these states are leaders in natural gas production, which is expected to replace coal as a power source as states strive to meet the nationwide 30 percent reduction in carbon emissions by 2030 proposed by the EPA.
The study found that the predicted shift from coal to natural gas has the potential to increase total U.S. gas consumption by up to 40 percent between 2020 and 2030, compared to the projected increase without the new regulations. Furthermore, 90 percent of that increased demand for natural gas is expected to be met through U.S. production. On the flip side, there will be up to a 47 percent reduction in coal demand.
The transition is predicted to translate into a 20 percent increase in the amount of money U.S. natural gas producers will bring in.
Houser points out that a lot of attention has been given to how energy prices will change in the country, but very little has been said on what happens upstream.
“Total nationwide energy expenditures will increase by $3 billion per year on average,” said Houser during an event launching the report. “But you have an order of magnitude larger than that change in production revenue when you shift from coal to gas.”
Combined, Louisiana, Texas, Arkansas, and Oklahoma stand to see their revenue from natural gas production soar by $18 billion. While these states will also take a $1 billion loss in revenue as coal demand falls, their total fossil fuel revenue will still go up by $17 billion.
“These states have some of the largest compliance obligations, nearly twice that of Midwest or New England states,” said Houser. “But they will also see the largest benefits from increased natural gas production.”
Despite this, Texas Governor Rick Perry led a group of nine governors, including Louisiana Governor Bobby Jindal, in sending a protest letter to President Obama citing the devastating economic effects of the EPA’s proposed rule.
Earlier this month, nine states, including Oklahoma, joined a lawsuit against the EPA’s proposed carbon rules, initially brought by coal company owner and climate denier Robert Murray.
The post Governors Who Hate EPA’s Carbon Regs: Your States May Benefit The Most appeared first on ThinkProgress.