
Improved Methods Save Money in Future Borehole Thermal Energy Storage Design

Globally, the gap between energy production and consumption is growing wider. To promote sustainability, Tugce Baser, a University of California San Diego PhD candidate and member of the ASCE GI Sustainability in Geotechnical Engineering committee, is working with Dr. John McCartney, Associate Professor at UC San Diego, and their research collaborators at the Colorado School of Mines, Dr. Ning Lu, Professor, and Dr. Yi Dong, Postdoctoral Researcher, to improve methods for borehole thermal energy storage (BTES), a system that stores solar heat in the soil during the summer for reuse in homes during the winter. Baser says, “We are running out of finite energy resources. We need to come up with new strategies to use free and renewable energy resources such as solar energy for a sustainable future.”

Baser’s BTES design.

How it works

BTES systems provide efficient, renewable thermal energy for heating buildings. They are configured to store thermal energy collected from solar thermal panels during the summer and discharge that heat to buildings during the winter. During charging, a fluid circulates within a closed-loop pipe network installed in vertical boreholes to inject the heat collected from the solar thermal panels into the ground. During winter, cold fluid is circulated through the heat exchangers to recover the heat from the subsurface and distribute it to the buildings. Baser explains, “The subsurface provides an excellent medium to store this heat due to the relatively lower thermal conductivity and lower specific heat capacity, especially when the soil layer is in the vadose zone. Lower thermal properties allow us to concentrate the heat in a specific array, and the heat losses to the environment are potentially low. These systems typically include an insulation layer and a hydraulic barrier near the ground surface to reduce heat and vapor losses to the atmosphere.”

BTES construction.

Why do we need improved methods?

Baser and her team are trying to improve the understanding of heat storage mechanisms and evaluate changes in the rate of heat transfer and heat storage in the vadose zone, where the soil is unsaturated. The goal of the project is to improve conventional methods by generating models that fit different soil types and conditions. She says, “The European community introduced us to the borehole thermal energy storage systems to provide heat specifically for domestic use, but there is still a chance for us to design them more efficiently by having a full understanding of the thermal response of these systems that is specific to the ground material and subsurface conditions. The primary objective of this research is to understand the mechanisms of coupled heat transfer and water flow in unsaturated soil profiles during the heat injection and subsequent heat extraction into these different arrays and different dimensions of borehole heat exchangers.”

Solar panels.

Baser and her team are designing numerical models based on the finite element method that improve on models in the literature used to characterize the thermal response of these systems. The new models add new considerations, such as the heat pipe effect in different soil types. Baser explains, “Because thermal and hydraulic properties of soils are highly coupled and are specific to soils, the thermal response of a BTES system will be different when it is installed in different types of soils. For example, you see the heat pipe effect, where there is evaporation and subsequent condensation, in fine-grained soils rather than coarse-grained soils because in coarse-grained soils the pore characteristics are different. The duration of the heat pipe effect (or convective cycle) is longer in fine-grained soils. We conclude that considering coupled heat transfer and water flow in the thermal response of a borehole thermal energy storage system is important.”

In-ground heat exchanger

Experiments in the field and in the lab help verify the new models

To fully understand heat transfer mechanisms and water flow in unsaturated soils, the research team installed two BTES systems at different scales, one on the Colorado School of Mines campus in Golden, Colorado, and the other at the UC San Diego research campus. Baser says, “The subsurface characteristics of both sites are different, and this gives us the opportunity to investigate the impact of the different soil layers on the thermal response experimentally at full scale. In addition, the scales of each borehole thermal energy storage system are different, and we also apply different heat injection rates. We have used these data to further validate our coupled heat transfer and water flow model so that we can use it for design purposes.”

Soil moisture sensor locations.

Baser started with laboratory heating experiments, in which soil in a large tank is heated by heat exchangers. She installed soil moisture sensors to measure volumetric water content and temperature, and then used the KD2 Pro thermal property analyzer to monitor thermal properties during the heating experiments and characterize the coupled thermo-hydraulic relationships. For the field experiments, the team uses soil moisture sensors equipped with temperature sensors along with the KD2 Pro to monitor subsurface temperature fluctuations, because ambient air temperature fluctuations, and how deeply they penetrate into the ground, can become significant during the warmer summer months.

Baser also uses thermistor strings that include six thermistors at different depths, along with thermistor pipe plugs, voltage input modules, and flow meters. She says, “Thermistor pipe plugs and flow meters are used in the manifold to monitor the inlet and outlet fluid temperatures and flow rates in each loop to calculate the heat transfer rate into the ground. Flow meters were installed to control flow in each loop because you don’t want to over- or underload the borehole loops. The amount of energy that you collect from the solar loop and the amount of energy that you inject into the ground can be used to define the efficiency of the system.” Baser says thermistor strings help monitor the ground temperature at different depths during the summer heat loading. They’re also used to monitor borehole wall temperature over time. The team installed one thermistor string 9 meters away from the heat storage array to see whether the far field is affected by the heat transfer within the array.
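The heat transfer rate Baser describes can be estimated from the inlet and outlet fluid temperatures and the flow rate measured in each loop. Below is a minimal Python sketch of that calculation; the fluid properties and the example readings are illustrative assumptions, not values from Baser’s study.

```python
# Estimate the heat injection rate into the ground for one borehole loop
# from inlet/outlet fluid temperatures and the loop flow rate:
#   Q = rho * V_dot * c_p * (T_in - T_out)

RHO_FLUID = 1000.0  # kg/m^3, assuming a water-like circulation fluid
CP_FLUID = 4186.0   # J/(kg*K), assumed specific heat of water

def heat_injection_rate(t_in_c, t_out_c, flow_lpm):
    """Return the heat transfer rate into the ground in watts.

    t_in_c, t_out_c : inlet and outlet fluid temperatures (deg C)
    flow_lpm        : volumetric flow rate (liters per minute)
    """
    v_dot = flow_lpm / 1000.0 / 60.0  # convert L/min to m^3/s
    m_dot = RHO_FLUID * v_dot         # mass flow rate, kg/s
    return m_dot * CP_FLUID * (t_in_c - t_out_c)

# Hypothetical readings from one loop's pipe-plug thermistors and flow meter
print(heat_injection_rate(t_in_c=40.0, t_out_c=32.0, flow_lpm=8.0))  # about 4.5 kW
```

Summing this quantity over all loops and over time, and comparing it with the energy collected by the solar loop, gives the kind of system efficiency Baser mentions.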

Insulation prevents heat loss to the environment.

The new models will save money in future Borehole Thermal Energy Storage design

Baser says building the numerical models and solving them was complicated and time consuming, but the results have been good. She explains, “We’ve recently proved, both experimentally and numerically, that considering coupled thermal and hydraulic relationships is very important for thermal response analysis. Thus, our recommendation is that it’s fine to use the analytical models and user-friendly numerical models that consider constant thermal properties in the design analyses for saturated soils. However, in unsaturated soils, there is a very high possibility that the contribution of heat transfer from evaporation and condensation would be missed and the borehole thermal energy storage system would be oversized, costing a significant amount of money. When dealing with soils in the vadose zone, coupled thermo-hydraulic constitutive relationships need to be considered in the modeling efforts.”

You can learn more about Tugce Baser’s research here.


Soil Moisture Sensors: Why TDR vs. Capacitance May Be Missing the Point

Time Domain Reflectometry (TDR) vs. capacitance is a common question for scientists who want to measure volumetric water content (VWC) of soil, but is it the right question?  Dr. Colin S. Campbell, soil scientist, explains some of the history and technology behind TDR vs. capacitance and the most important questions scientists need to ask before investing in a sensor system.

TDR began as a technology the power industry used to determine the distance to a break in damaged power lines.

Clarke Topp

In the late 1970s, Clarke Topp and two colleagues began working with a technology the power industry used to determine the distance to a break in damaged power lines. Time Domain Reflectometers (TDR) generated a voltage pulse which traveled down a cable, reflected from the end, and returned to the transmitter. The time required for the pulse to travel to the end of the cable directed repair crews to the correct trouble spot. The travel time depended on the distance to the break where the voltage was reflected, but also on the dielectric constant of the cable environment. Topp realized that water has a high dielectric constant (80) compared to soil minerals (4) and air (1). If bare conductors were buried in soil and the travel time measured with the TDR, he could determine the dielectric constant of the soil, and from that, its water content. He was thus able to correlate the time it took for an electromagnetic pulse to travel the length of steel sensor rods inserted into the soil to volumetric water content. Despite his colleagues’ skepticism, he proved that the measurement was consistent for several soil types.
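As a rough illustration of the principle Topp worked out, the sketch below converts a measured two-way travel time along rods of known length into an apparent dielectric constant, and then into volumetric water content using the widely cited Topp et al. (1980) calibration for mineral soils. The rod length and travel time in the example are hypothetical.

```python
# TDR principle: travel time -> apparent dielectric constant -> water content
#   Ka = (c * t / (2 * L))^2        (two-way travel along rods of length L)
#   VWC from the Topp et al. (1980) empirical calibration for mineral soils

C_LIGHT = 3.0e8  # speed of light in vacuum, m/s

def apparent_dielectric(travel_time_s, rod_length_m):
    """Apparent dielectric constant from the two-way travel time along the rods."""
    return (C_LIGHT * travel_time_s / (2.0 * rod_length_m)) ** 2

def topp_vwc(ka):
    """Volumetric water content (m^3/m^3), Topp et al. (1980)."""
    return -5.3e-2 + 2.92e-2 * ka - 5.5e-4 * ka**2 + 4.3e-6 * ka**3

# Hypothetical reading: 2.0 ns two-way travel time on 0.15 m rods
ka = apparent_dielectric(2.0e-9, 0.15)  # Ka = 4, i.e., quite dry soil
print(topp_vwc(ka))                     # about 0.06 m^3/m^3
```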

TDR sensors consume a lot of power. They may require solar panels and larger batteries for permanent installations.

TDR Technology is Accurate, but Costly

In the years since Topp et al.’s (1980) seminal paper, TDR probes have proven to be accurate for measuring water content in many soils. So why doesn’t everyone use them? The main reason is that these systems are expensive, limiting the number of measurements that can be made across a field. In addition, TDR systems can be complex, and setting them up and maintaining them can be difficult. Finally, TDR sensors consume a lot of power. They may require solar panels and larger batteries for permanent installations. Still, TDR has great qualities that make these types of sensors a good choice. For one thing, the reading is almost independent of electrical conductivity (EC) until the soil becomes salty enough to absorb the reflection. For another, the probes themselves contain no electronics, making them good for long-term monitoring installations since the electronics are not buried and can be accessed for servicing as needed. Probes can be multiplexed, so several relatively inexpensive probes can be read by one set of expensive electronics, reducing cost for installations requiring multiple probes.

Many modern capacitance sensors use high frequencies to minimize effects of soil salinity on readings.

Advances in Electronics Enable Capacitance Technology

Dielectric constant of soil can also be measured by making the soil the dielectric in a capacitor.  One could use parallel plates, as in a conventional capacitor, but the measurement can also be made in the fringe field around steel sensor rods, similar to those used for TDR.  The fact that capacitance of soil varies with water content was known well before Topp and colleagues did their experiments with TDR.  So, why did the first attempt at capacitance technology fail, while TDR technology succeeded? It all comes down to the frequency at which the measurements are made.  The voltage pulse used for TDR has a very fast rise time.  It contains a range of frequencies, but the main ones are around 500 MHz to 1 GHz.  At this high frequency, the salinity of the soil does not affect the measurement in soils capable of growing most plants.  

Like TDR, capacitance sensors use a voltage source to produce an electromagnetic field between metal electrodes (usually stainless steel), but instead of a pulse traveling down the rods, positive and negative charges are briefly applied to them. The charge stored is measured and related to volumetric water content. Scientists soon realized that how quickly the electromagnetic field was charged and discharged was critical to success.  Low frequencies led to large soil salinity effects on the readings.  This new understanding, combined with advances in the speed of electronics, meant the original capacitance approach could be resurrected. Many modern capacitance sensors use high frequencies to minimize effects of soil salinity on readings.  

NASA used capacitance technology to measure water content on Mars.

Capacitance Today is Highly Accurate

With this frequency increase, most capacitance sensors available on the market show good accuracy. In addition, the circuitry in them can be designed to resolve extremely small changes in volumetric water content, so much so that NASA used capacitance technology to measure water content on Mars. Capacitance sensors are lower cost because they don’t require a lot of circuitry, allowing more measurements per dollar. Like TDR, capacitance sensors are reasonably easy to install. The measurement prongs tend to be shorter than TDR probes, so they can be less difficult to insert into a hole. Capacitance sensors also tend to have lower energy requirements and may last for years in the field powered by a small battery pack in a data logger.

In two weeks: Learn about challenges facing both types of technology and why the question of TDR vs. Capacitance may not be the right question.


Where Will the Next Generation of Scientists Come From?

The Global Learning and Observations to Benefit the Environment (GLOBE) Program is an international science and education program that provides students and the public worldwide with the opportunity to participate in data collection and the scientific process.

GLOBE has a huge impact in schools around the world.

Its mission is to promote the teaching and learning of science, enhance community environmental literacy and stewardship, and provide research quality environmental observations.  The GLOBE program works closely with agencies such as NASA to do projects like validation of SMAP data and the Urban Heat Island/Surface Temperature Student Research Campaign.  The figure below shows the impact GLOBE is having in schools worldwide.

Dixon Butler, former GLOBE Chief Scientist, is excited about the recent African project GLOBE is now participating in called the TAHMO project.  He says, “Right now, in Kenya and Nigeria, GLOBE schools are putting in over 100 new  mini-weather stations to collect weather data, and all that usable data will flow into the GLOBE database.”

Participating in real science at a young age gets youth more ready to be logical, reasoning adults.

Why Use Kids to Collect Data?

Dixon says kids do a pretty good job taking research quality environmental measurements.  Working with agencies like NASA gets them excited about science, and participating in real science at a young age gets them more ready to be logical, reasoning adults.  He explains, “The 21st century requires a scientifically literate citizenry equipped to make well-reasoned choices about the complex and rapidly changing world. The path to acquiring this type of literacy goes beyond memorizing scientific facts and conducting previously documented laboratory experiments to acquiring scientific habits of mind through doing hands-on, observational science.”

Dixon says when GLOBE started, the plan was to have the kids measure temperature.  But one science teacher, Barry Rock, who had third grade students using Landsat images to do ozone damage observations, called the White House and said, “Kids can do a lot more than measure temperature.” He gave a presentation at the White House where he showed a video of two third grade girls looking at Landsat imagery. They were discussing their tree data, and at one point, one said to the other, ‘That’s in the visible. Let’s look at it in the false color infrared.’  At that point, Barry became the first chief scientist of GLOBE, and he helped set up the science and the protocols that got the program started.

GLOBE uses online and in-person training and protocols to be sure the students’ data is research quality.

Can GLOBE Data be Used by Scientists?

GLOBE uses online and in-person training and protocols to be sure the students’ data is research quality.  Dixon explains, “There was a concern that these data be credible, so the idea was to create an intellectual chain of custody where scientists would write the protocols in partnership with an educator so they would be written in an educationally appropriate way.  Then the teachers would be trained on those protocols. The whole purpose is to be sure scientists have confidence that the data being collected by GLOBE is useable in research.”

Today GLOBE puts out a Teachers’ Guide, and the protocols have increased from 17 to 56. The soil area went from just a temperature and moisture measurement to a full characterization. Dixon says, “We’ve been trying to improve it ever since, and I think we’re getting pretty good at it.”

GLOBE students were the only ones going around looking up at the sky doing visual categorization of clouds and counting contrails. It was just no longer being done, except by these students.

What About the Skeptics?

If you ask Dixon how he deals with skeptics of the data collected by the kids, he says, “I tell them to take a scientific approach. Check out the data, and see if they’re good. One year, a GLOBE investigator found a systematic error in U-tube maximum/minimum thermometers mounted vertically, which had been in use for over a century, that no one else found. The GLOBE data were good enough to look at and find the problem. There are things the data are good for and things they’re not good for. Initially, we wanted these data to be used by scientists in the literature, and there have been close to a dozen papers, but I would argue that GLOBE hasn’t yet gotten to the critical mass of data that would make that easier.”

GLOBE did have enough cloud data, however, to be used in an important analysis of geostationary cloud data in which a scientist compared GLOBE student observations with satellite data. Dixon adds, “GLOBE students were the only ones going around looking up at the sky doing visual categorization of clouds and counting contrails. It was just no longer being done, except by GLOBE students. Now GLOBE has developed the GLOBE Observer app that lets everyone take and report cloud observations.”

Young minds need to experience the scientific approach of developing hypotheses, taking careful, reproducible measurements, and reasoning with data.

What’s the Future of GLOBE?

Dixon says GLOBE’s goal is to raise the next generation of intelligent constituents in the body politic. He says, “I thought about this a lot when I worked for the US Congress.  In addition to working with GLOBE, I now have a non-profit grant-making organization called YLACES with the objective of helping kids to learn science by doing science.  Young minds need to experience the scientific approach of developing hypotheses, taking careful, reproducible measurements, and reasoning with data. Inquiries should begin early and grow in quality and sophistication as learners progress in literacy, numeracy, and understanding scientific concepts. In addition to fostering critical thinking skills, active engagement in scientific research at an early age also builds skills in mathematics and communications. These kids will grow up knowing how to think scientifically. They’ll ask better questions, and they’ll be harder to fool.   I think that’s what the world needs, and I see the environment and science as the easiest path to get there.”

Learn more about GLOBE and its database here and about YLACES at www.ylaces.org.


New Weather Station Technology in Africa (Part 3)

The Trans African Hydro and Meteorological Observatory (TAHMO) project expects to put 20,000 microenvironment monitors across Africa in order to understand the weather patterns which affect that continent, its water, and its agriculture. In the conclusion of our three-part series, we interview Dr. John Selker about his thoughts on the project.

The economics of weather data value may be going up because we’re reaching a cusp in terms of humanity’s consumption of food.

In your TEDx talk you estimate that U.S. weather stations directly bring U.S. consumers 31 billion dollars in value per year. Can Africa see that same kind of return?

Even more.  The economics of weather data value may be going up because we’re reaching a cusp in terms of humanity’s consumption of food.  Africa, one could argue, is the breadbasket for this coming century.  Thus, the value of information about where we could grow what food could be astronomical.  It’s very difficult to estimate.  One application of weather data is crop insurance.  Right now, crop insurance is taking off across Africa. The company we’re working with has 180,000 clients just in Kenya.  When we talked about 31 billion dollars in the U.S., that is the value citizens report, but you need to add to that protection against floods, increased food production, water supply management, crop insurance and a myriad of other basic uses for weather data.  In Africa, the value of this type of protection alone pays for over 1,000 times the cost of the weather stations.

Another application for weather data is that in Africa, the valuation of land itself is uncertain. So if, because of weather station data, we find that a particular microclimate is highly valuable, suddenly land goes from having essentially no value to becoming worth thousands of dollars per acre.  It’s really difficult to estimate the impact the data will have, but it could very well end up being worth trillions of dollars.  We have seen this pattern take place in central Chile, where land went from about $200/hectare in 1998 to over $3,000/ha now due to the understanding that it was exceptionally suited to growing pine trees, which represented a change in land value exceeding $3 billion.

Does the effect of these weather stations go beyond Africa?

There’s limited  water falling on the earth, and if you can’t use weather data to invest in the right seeds, the right fertilizer, and plant at the right time in the right place, you’re not getting the benefit you should from having tilled the soil.  So for Africa the opportunity to improve yields with these new data is phenomenal.  

In terms of the world, the global market for calories is now here, so if we can generate more food production in Africa, that’s going to affect the price and availability of food around the world.  The world is one food community at this point, so an entire continent having inefficient production and ineffective structures costs us all.

If we can generate more food production in Africa, that’s going to affect the price and availability of food around the world.

You’re collecting data from Africa. Is it time to celebrate yet?

I think this is going to be one of those projects where we are always chilling the champagne and never quite drinking it. It is such a huge scope trying to work across a continent. So I would say we’ve got some stations all over Africa, we’re learning a lot, and we’ve got collaborators who are excited. We have reason to feel optimistic. It will be another five years before I’ll believe that we have a datastream that is monumental. Right now we’re still getting the groundwork taken care of. By September of this year we expect to have five hundred stations in place, and then two years from now, over two thousand. This will be a level of observation that will transform the understanding of African weather and climate.

This is a project of hundreds of people across the world putting their hands and hearts in to make this possible.

How do you deal with the long wait for results?  

In science there is that sense you get when you want to know something, and you can see how to get there. You have a theory, and you want to prove it. It kind of captures your imagination. It’s a combination of curiosity and the potential to actually see something happen in the world: to go from a place where you didn’t know what was going on to a place where you do know what’s going on. I think about Linus Pauling, who made the early discoveries about the double helix. He had in his pocket the X-ray crystallography data to show that the protein of life was in helical form, and he said, “In my pocket, I have what’s going to change the world.” When we realized the feasibility of TAHMO, we felt much the same way.

Sometimes in your mind, you can see that path: how you might change the world.  It may never be as dramatic as what Pauling did, but even a small contribution has that same excitement of wanting to be someone who added to the conversation, who added to our ability to live more gracefully in the world.  It’s that feeling that carries you along, because in most of these projects you have an idea, and then ten years later you say, “why was it that hard?”  

Things are usually much harder than your original conception, and that energy and curiosity really help you through some of the low points in your projects. So, curiosity has a huge influence on scientific progress. Changing the world is always difficult, but the excitement, the curiosity, and working with people all fit together to pull us through the tough slogs. In TAHMO, I cannot count the number of people who have urged us to keep the effort moving forward and given us a lift just when we needed it most. This is a project of hundreds of people across the world putting their hands and hearts in to make this possible. Having these TAHMO supporters is an awesome responsibility and concrete proof of the generosity and optimism of the human spirit.

Learn how you can help TAHMO.

New Weather Station Technology in Africa (Part 2)

Weather data improve the lives of many people. But, there are still parts of the globe, such as Africa, where weather monitoring doesn’t exist (see part 1). John Selker and his partners intend to remedy the problem through the Trans African Hydro Meteorological Observatory (TAHMO).  Below are some challenges they face.

TAHMO aims to deploy 20,000 weather stations across the continent of Africa in order to fill a hole that exists in global climate data.

Big Data, Big Governments, and Big Unknowns

Going from an absence of data to the goal of 20,000 weather stations offers hope for positive change. However, Selker is still cautious. “Unintended consequences are richly expressed in the history of Africa, and we worry about that a lot. It’s an interesting socio-technical problem.” This is why Selker and others at TAHMO are asking how they can bring this technology to Africa in a way that fits with local cultures, independence, and the autonomy communities want to maintain.

TAHMO works with the government in each country where stations are deployed, negotiating agreements and making sure the desires of each recipient country are met. Even with agreements in place, TAHMO must trust that officials in each country will act in the best interest of their people: a gamble in countries where corruption is a factor that must be addressed. Selker illustrates this point by recalling an instance in 1985 when he witnessed a corrupt government official take an African farmer’s land because its value had increased due to a farm-scale water development project.

Most TAHMO weather stations are hosted and maintained by a local school, making each station available as an educational tool teachers can use to teach about climate and weather. Data from TAHMO are freely available to the government of the country where the weather station is hosted, to researchers who directly request data, and to the school hosting and maintaining the weather station. Commercial organizations will be able to purchase the data, and the profits will be used to maintain and expand TAHMO’s infrastructure.

Selker says it’s all about collaboration.

Terrorism, Data, and Open Doors

“When I wanted to go out and put in weather stations, my wife simply said, ‘no, you will not go to Chad.’ … because it is Boko Haram central,” Selker says.

Boko Haram, a terrorist organization that has pledged allegiance to ISIS, creates an uncommon hurdle. Currently Boko Haram is most active in Nigeria but has made attacks in Chad, Cameroon, and Niger.

Selker also mentioned similar issues with ISIS, “When ISIS came through Mali, the first thing they did is destroy all the weather stations. So they have no weather data right now in Mali.” Acknowledging the need for security, he adds, “we’re  completing the installation of  eight stations [in Mali] in April.”

“We have good contacts [in Nigeria] and they’re working hard to get permission to put up stations right now in that area. We’ve shipped 15 stations which are ready to install. With these areas we can’t go visit, it’s all about collaboration. It’s about partners and people you know. We have a partnership with a tremendous group of Africans who are really the leading edge of this whole thing.”

Most TAHMO weather stations are hosted and maintained by a local school.

A Hopeful Future

Despite the challenges of getting this large-scale research network off the ground, Selker and his group remain hopeful.  About his weather data he says, “It’s not glamorous stuff, you won’t see it on the cover of magazines, but these are the underpinnings of a successful society.”

Selker optimistically adds, “We are in a time of incredible opportunity.”

Learn more about TAHMO

Next Week:  Read an interview with Dr. John Selker on his thoughts about TAHMO.

New Weather Station Technology in Africa

Weather data, used for flight safety, disaster relief, crop and property insurance, and emergency services, contribute over $30 billion in direct value to U.S. consumers annually. In Africa, however, the availability of weather observations has been in consistent decline since the 1990s. Most weather stations are costly and require highly trained individuals to maintain. As a result, the number of weather stations in African countries has steadily declined over the last seventy years. Oregon State University’s Dr. John Selker and his partners intend to remedy the problem through his latest endeavor, the Trans African Hydro Meteorological Observatory (TAHMO).

Weather data improve the lives of many people. But, there are still parts of the globe where weather monitoring doesn’t exist.

Origins of TAHMO

TAHMO is a research-based organization that aims to deploy 20,000 weather stations across the continent of Africa in order to fill a hole that exists in global climate data. TAHMO originated from a conversation between Selker and Dr. Nick van de Giesen from Delft University of Technology while doing research in Ghana. Having completed an elaborate study on canopy interception at a cocoa plantation in 2006, they hit a “data wall.” There was virtually no weather data available in Ghana, a problem shared by most African countries. This opened the door to what would later become TAHMO.

The majority of weather stations are being installed at local schools where teachers are using the data in their classroom lessons.

Logistics and Equipment

Originally Selker and van de Giesen set out to make a $100 weather station, which Selker admitted “turned out to be harder than we thought.” Not only was making a widely deployable, affordable, research-grade, no-moving-parts weather station difficult, but additional challenges presented themselves.

“The model of how we might measure the weather in Africa, the whole business model, the production model, infrastructure support, the database and delivery system, the agreements with the countries, agreements with potential data buyers, that all took us a long time to sort out.” Despite these challenges, in 2010 it started to look feasible. “That’s when we really started to figure out what the technology we were going to use was going to look like.”

After giving a lecture at Washington State University, Selker spoke with Dr. Gaylon Campbell about the project, which led to a long development-deployment-development cycle. Eventually the final product emerged as a low-maintenance, no-moving-parts, cellular-enabled, solar-powered weather station.

An estimated 60 percent of the African population earn their income by farming.

Agricultural Benefits of Weather Stations

Crop insurance, a service that is widely used in developed countries, relies on weather data. Once historical data exist, insurance rates can be set, and farmers can purchase crop insurance to replace a crop that is lost to drought, weather, wildfire, etc. On a continent where a larger share of the population depends on subsistence farming than anywhere else, this empowers farmers to take larger risks. Without insurance, farmers need to conserve seed, saving enough to eat and plant again if a crop fails. With crop insurance, crop loss is not as devastating, and farmers can produce larger yields without worrying about losing everything. Hypothetically, this would lead to more food available to the global market, stabilizing food prices year over year.

Crop insurance aside, weather data provide growers with information like when to plant, when not to plant, what crops to plant, and when and if to treat for disease. For rainfed crops, this can mean the difference between a successful yield and a failure.

“Currently in most African countries, the production per acre is about one-sixth of that in the United States. That is the biggest opportunity, in my opinion, for sustainable growth without having to open up new tracts of land. The land is already under cultivation, but we can up productivity, probably by a factor of four, by giving information about when to plant,” Selker comments.  

Despite the social benefits, Selker makes it clear that the TAHMO effort is based on mutual benefit: “We are here for a reason, we want these data to advance our research on global climate processes.  This is a global win-win partnership.”

Learn how you can help TAHMO by getting active.

Next week:  Read about some of the challenges facing TAHMO

Best Research Instrument Hacks

Sometimes, brilliant ideas are born out of necessity.  We wanted to highlight innovative ways people have modified their instrumentation to fit their research needs.  Here, Georg von Unold, founder and president of UMS (now METER) illustrates ingenuity in a story that inspired the invention of the first UMS tensiometer and what could be one of the greatest scientific instrument hacks of all time.

The Bavarian Alps

An Early Penchant for Ingenuity

In 1986, graduating German students were required to join the military or perform civil service.  Von Unold chose to do a civil service project investigating tree mortality in the alpine region of the Bavarian Mountains.  He explains, “We were trying to understand pine tree water stress in a forest decline study related to storms in certain altitudes where trees were inexplicably falling over. The hypothesis was that changing precipitation patterns had induced water stress.”  

To investigate the problem, von Unold’s research team needed tensiometers that could measure the water stress of plants in the soil, which was not easy. The tensiometers von Unold found were not able to reach the required water potential without cavitating, so he decided to design a new type of tensiometer. He says, “I showed my former boss the critical points. It must be glued perfectly, the ceramic needed a defined porosity, a reliable air reference access, and water protection of the pressure transducer. I explained it with a transparent acrylic glass prototype to make it easier to understand. At a certain point my boss said, ‘Okay, please stop. I don’t understand much about these things, but you can make those on your own.’”

Two snorkels protected a data logger predecessor from relative humidity.

Snorkels Solve a Research Crisis

The research team used those tensiometers (along with other chemical and microbial monitoring) to investigate why trees only in the precise altitude band of 800 to 1100 meters were dying. One challenge facing the team was that they didn’t have access to anything we might call a data logger today. Von Unold says, “We did have a big process machine from Schlumberger that could record the sensors, but it wasn’t designed to be placed in alpine regions where winter temperatures reached -30℃ or below. We had to figure out how to protect this extremely expensive machine, which back then cost more than my annual salary.”

Von Unold’s advisor let him use the machine, cautioning him that the humidity it was exposed to could not exceed 80%, and the temperature must not fall below 0℃. As von Unold pondered how to do this, he had an idea. Since the forest floor often accumulated more than a meter of snow, he designed an aluminum box with two snorkels that would reach above the snow. The snorkels were guided to a height of two meters. Using these air vents, he drew a small amount of cold, dry air into the box. Then, he took his mother’s hot iron, bought a thermostat switch to replace the existing one (so it turned on in the range of 0-30℃), and mounted a large aluminum plate on the iron’s metal plate to better distribute the heat.

Von Unold says, “Pulling in the outside air and heating it worked well. The simple technique reduced the relative humidity and controlled the temperature inside the box. Looking back, we were fortunate there wasn’t condensing water and that we’d selected a proper fan and hot iron. We didn’t succeed entirely, as on hot summer days it was a bit moist inside the box, but luckily, the circuit boards took no damage.”

Tree mortality factors were only found at the precise altitude where fog accumulated.

Finding Answers

Interestingly, the research team discovered there was more to the forest decline story than they thought. Fog interception in this range was extremely high, and when it condensed on the needles, the trees absorbed more than moisture.  Von Unold explains, “In those days people of the Czech Republic and former East Germany burned a lot of brown coal for heat. The high load of sulphur dioxide from the coal reduced frost resistivity and damaged the strength of the trees, producing water stress.  These combined factors were only found at the precise altitude where the fog accumulated, and the weakened trees were no match for the intense storms that are sometimes found in the Alps.”  Von Unold says once the East German countries became more industrialized, the problem resolved itself because the people stopped burning brown coal.

Share Your Hacks with Us

Do you have an instrument hack that might benefit other scientists?  Send your idea to kcampbell@metergroup.com.

New Infiltrometer Helps City of Pittsburgh Limit Traditional Stormwater Infrastructure (Part 2)

To save the aesthetics of Dellrose Street, an aging, 900 ft. long brick road, the city of Pittsburgh wanted to limit traditional stormwater infrastructure (see part 1). Jason Borne, a stormwater engineer for ms consultants, and his team decided permeable pavers were a viable option and used two different types of infiltrometers to determine soil infiltration potential. Here’s how they compared.

Setting up the infiltrometers.

Shortened Test Times Allow Design Changes on the Fly

Though most of the subsoil was a clay urban fill, there was a distinct transition between that clay material to a broken shale/clay mixture.  Borne says, “After excavation, it rained, and we saw that the water was disappearing through the broken shale/clay material.  When we did the infiltration tests, the broken shale/clay showed a higher infiltration potential than the clay fill material.  That led us to modify the design of the subsurface flow barriers based on specific observed infiltration rates of the subsoils. Where the tests showed higher hydraulic conductivity values, we were able to rely on infiltration entirely to remove the water from behind the check dams.”  Borne adds that in the areas where infiltration was poor, they augmented infiltration with a slow release concept. “We put some weep holes in the flow barrier and let the water trickle out down to the next barrier and so on.  Basically, the automated SATURO infiltrometer allowed us to do many tests in a short amount of time to establish a threshold of where good infiltrating soils and poor infiltrating soils were located.  This enabled us to change the design on the fly.  The double ring infiltrometer takes significantly more time to do a test, and time is of the essence when the contractor wants to backfill the area and get things moving. It was nice to have a tool that got us the information we needed more rapidly.”

SATURO Infiltrometer

How did the Double Ring and SATURO Compare?

Borne says the SATURO Infiltrometer was faster and reduced the possibility of human error. He adds, “We liked the idea of it being very standardized. The automated plot of flux over time was also of great interest to us, because we could see a trend, or anomalies that might invalidate the results we were getting. The double ring infiltrometer takes a long time to achieve a state of equilibrium, and it’s hard to know when that occurs. You’re following the Pennsylvania Department of Environmental Protection suggested guidelines, but they’re very generalized. To me, they don’t suit all situations. What we found with the SATURO infiltrometer is it records information at discrete intervals, plots a curve of the flux over time, and when it levels out, you basically achieve equilibrium. You get to that state of equilibrium faster. There’s a water savings, but there’s also a time savings. And there’s the satisfaction of getting standardized results rather than the possibility of each technician applying the principles in a slightly different way, as they might with the double ring infiltrometer.”
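As a simple illustration of the idea Borne describes, the sketch below flags quasi-steady infiltration when successive flux readings stop changing by more than a small fraction. The tolerance, window, and readings are arbitrary choices for illustration; this is not the SATURO’s actual algorithm.

```python
# Flag quasi-steady infiltration once recent flux readings level out.
# Illustrative only; not the SATURO infiltrometer's internal method.

def reached_equilibrium(fluxes, rel_tol=0.02, window=4):
    """Return True if the last `window` reading-to-reading changes are all
    smaller than rel_tol as a fraction of the earlier reading."""
    if len(fluxes) < window + 1:
        return False
    recent = fluxes[-(window + 1):]
    for earlier, later in zip(recent, recent[1:]):
        if earlier == 0 or abs(later - earlier) / abs(earlier) > rel_tol:
            return False
    return True

# Hypothetical flux series (cm/hr) recorded at fixed intervals
readings = [12.0, 9.1, 7.4, 6.5, 6.1, 6.02, 5.98, 5.95, 5.93]
print(reached_equilibrium(readings))  # True: the flux curve has leveled out
```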

Borne and his team were ultimately able to prepare a permeable paver street design which allowed for the exclusion of traditional storm sewer infrastructure, reducing both capital costs and long-term maintenance life cycle costs. The permeable paver concept is intended to provide a template for the city of Pittsburgh to apply to the future reconstruction of other city streets.


New Infiltrometer Helps City of Pittsburgh Limit Traditional Stormwater Infrastructure

Though difficult and expensive to repair, the brick-paved streets that still exist in some Pittsburgh, Pennsylvania neighborhoods are worth saving. Dellrose Street, an aging, 900 ft. long, brick road, was in need of repair, but the city of Pittsburgh wanted to limit traditional stormwater infrastructure, such as pipes and catch basins.

Dellrose Street permeable paver system

To save the aesthetics of the neighborhood, they hired ms consultants, inc. to design a permeable paver solution for controlling stormwater runoff volumes and peak runoff rates that would traditionally be routed off-site via storm sewers. Jason Borne, a stormwater engineer for ms consultants who worked on the project, says, “What we try to do is understand the in situ infiltration potential of the subsoils to determine the most efficient natural processes for attenuating flows, either through infiltrating excess water volume back into the soil or through slow release off-site.” He used the SATURO Infiltrometer to get an idea of how urban fill material would infiltrate water.

Green Infrastructure Aids Natural Infiltration

As Borne and his team investigated what they could do to slow down the runoff, they decided permeable pavers would be a viable solution.  He says, “There’s not much you can do once you put in a hardened surface like a pavement.  Traditional pavement surfaces accelerate the runoff which requires catch basins and large diameter pipes to carry the runoff off-site. We were interested in investigating what some of the urban subsoils, or urban fill would allow us to do from an infiltration perspective.  As we started looking at some of these subsoils, we decided a permeable paver system would be ideal for this particular street.”

Subsurface flow barrier installation

Infiltrometers Determine Natural Infiltration Potential

Once the water flowed into the aggregate, the team began to figure out ways to slow it down and promote infiltration.  Borne says, “Basically we came up with a tiered subsurface flow barrier system.  We had about 60 concrete flow barriers across the subgrade within the aggregate base of the road. We needed so many because the longitudinal slope of the road was fairly significant. Behind each of these barriers we stored a portion of the stormwater that would typically run off the site.  The ideal was to remove the stored water through infiltration–to get it down to the subgrade and away, so we used infiltrometers to help us establish where we could maximize infiltration and where we might need to rely on other management methods.”

A Need for Faster Test Times Inspires a Comparison

Borne says that USDA soil surveys are too generalized for green infrastructure applications in urban areas and only give crude approximations of soil hydraulic conductivity. Understanding the best way to promote natural infiltration requires a very specific infiltration rate or hydraulic conductivity for the location of interest. He says, “The goal is to excavate down to the desired elevation before construction and find out, through some kind of device, what the infiltration potential of the subsoil is. Typically we use a double ring infiltrometer, but it’s a very manual device. We’re constantly refilling water, and it requires us to be on-site and attentive to what’s happening. We can’t really multitask, especially in areas of decently infiltrating soils where the device might run out of water in 30 minutes or less. So, in the interest of saving water and time, we used the automated SATURO infiltrometer and the manual double ring infiltrometer concurrently for comparison purposes.”

Next week:  Find out how the two infiltrometers compared.


How to Get More From Your NDVI Sensor (Part 3)

In the conclusion of our three-part series on improving NDVI sensor data (see part 2), we discuss how to correct for limitations which occur in high leaf area index (LAI) conditions.

Where there’s a large amount of vegetation, NDVI tends to saturate.

NDVI Limitations – High LAI

NDVI is useful in the midrange of LAIs as long as you don’t have strong soil effects, but as you approach an LAI above 4, you lose sensitivity. In Figure 6, the loss of sensitivity is primarily due to saturation in the red band. Measurements were taken in a wheat canopy and a maize canopy. The near infrared reflectance remains sensitive across the entire range of the wheat and maize canopies, but the red saturates relatively quickly. Where the red starts to saturate is where the NDVI starts to saturate.

Figure 6: Gitelson (2004) J. Plant Phys

Note: NDVI saturates at high LAIs; however, if your purpose is to get at the fractional interception of light, NDVI tends not to have the saturation issue. In Figure 7, fPAR, the fractional interception of photosynthetically active radiation, is nearly complete well before NDVI saturates. This is because canopies are efficient at intercepting light, and once we get to an LAI of about 4, most of the light has been intercepted or absorbed by the canopy. Thus, incremental increases in LAI don’t significantly affect the fPAR variable.

Figure 7: Fractional interception of light is near complete at an LAI around 4. (Gamon et al. (1995) Eco. Apps)

Solution 3 – WDRVI

One solution for the NDVI saturation issue is called the Wide Dynamic Range Vegetation Index (WDRVI). Its formulation is similar to NDVI, except for a weighting coefficient that can be used to reduce the disparity between the contribution of the near infrared and red reflectance.

WDRVI = (a × NIR − Red) / (a × NIR + Red)

In the WDRVI, a is multiplied by the near infrared reflectance to reduce its value and bring it closer to the red reflectance value. In doing so, it balances out the red and the near infrared contribution to the vegetation index.

Figure 8: (Gitelson (2004) J. Plant Phys)

The coefficient a can range anywhere from 0 to 1. Figure 8 shows that as we use a smaller value of a, we get an increasingly linear response of the wide dynamic range vegetation index to LAI.

The only drawback of the WDRVI is that the selection of a is subjective. It’s something you experiment with on your own until you find a value of a that is optimal for your application. People tend to err on the side of a very low value simply because the response to LAI becomes closer and closer to linear as a decreases.
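For readers who want to reprocess raw sensor output, here is a minimal Python sketch computing NDVI and WDRVI from red and near infrared reflectances. The reflectance values and the choice of a = 0.2 are illustrative assumptions, not recommendations.

```python
# NDVI and WDRVI from red and near infrared (NIR) reflectances
#   NDVI  = (NIR - Red) / (NIR + Red)
#   WDRVI = (a * NIR - Red) / (a * NIR + Red), 0 < a <= 1 (Gitelson 2004)

def ndvi(nir, red):
    return (nir - red) / (nir + red)

def wdrvi(nir, red, a=0.2):
    return (a * nir - red) / (a * nir + red)

# Hypothetical reflectances for a dense canopy where NDVI is near saturation
nir, red = 0.55, 0.04
print(ndvi(nir, red))        # ~0.86, pressed up against the saturation ceiling
print(wdrvi(nir, red, 0.2))  # ~0.47, retains more dynamic range at high LAI
```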

Solution 4 – Enhanced Vegetation Index

The enhanced vegetation index (EVI) was designed to enhance sensitivity in high biomass ecosystems, but it also attempts to reduce atmospheric influences.  This was a vegetation index created for the purposes of a satellite based platform. There’s a lot of atmosphere to look through from a satellite to the ground, and sometimes the aerosols in the atmosphere affect the reflectances in the red and the near infrared regions causing spurious observations.  The EVI also tries to reduce sensitivity of the index to soil. Thus the EVI is a kind of solution to both extremes.

EVI = G × (NIR − Red) / (NIR + C1 × Red − C2 × Blue + L)

In the EVI equation, the two major inputs are the near infrared and red reflectances. G, C1, C2, and L are all parameters that can be estimated, but the blue band is something that has to be measured. Most NDVI sensors are two-band sensors, so you don’t have that information in the blue. Plus, with satellites, the blue band is relatively noisy and doesn’t always have the best quality data, so EVI has limited value.
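As a reference point, the sketch below implements the EVI equation using the coefficient values published for the MODIS EVI product (G = 2.5, C1 = 6, C2 = 7.5, L = 1); the reflectance values are hypothetical.

```python
# EVI with the standard MODIS coefficient values (G=2.5, C1=6, C2=7.5, L=1).
# Note the blue band requirement, which most two-band NDVI sensors cannot meet.

def evi(nir, red, blue, g=2.5, c1=6.0, c2=7.5, L=1.0):
    return g * (nir - red) / (nir + c1 * red - c2 * blue + L)

# Hypothetical reflectances
print(evi(nir=0.40, red=0.08, blue=0.04))  # ~0.51
```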

Solution 5 – EVI2 (Enhanced Vegetation Index 2)

Those problems led a scientist named Jiang to come up with a solution. Jiang observed quite a bit of correlation between the red band and the blue band, so he decided to try to formulate EVI without the blue band, in what he called EVI2 (Enhanced Vegetation Index 2). If you’re interested in the mathematics, we encourage you to go read his paper, but here we give you the equation in case you’re interested in using it.

EVI2 = 2.5 × (NIR − Red) / (NIR + 2.4 × Red + 1)

Figure 9

When Jiang calculated his EVI2 and compared it to the traditional EVI (Figure 9), the relationship was nearly one to one. For all intents and purposes, EVI2 was equivalent to EVI. Since it avoids the blue band, it offers some exciting possibilities: it needs only the NIR and red bands, the same two inputs used to calculate NDVI.
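And here is the two-band EVI2 from Jiang et al. (2008) for comparison; with the same hypothetical reflectances used in the EVI example above, it lands very close to the three-band EVI, which is the near one-to-one behavior Figure 9 illustrates.

```python
# EVI2 (Jiang et al. 2008): a two-band approximation of EVI without the blue band.
#   EVI2 = 2.5 * (NIR - Red) / (NIR + 2.4 * Red + 1)

def evi2(nir, red):
    return 2.5 * (nir - red) / (nir + 2.4 * red + 1.0)

# Same hypothetical reflectances as the EVI example above
print(evi2(nir=0.40, red=0.08))  # ~0.50, nearly identical to EVI's ~0.51
```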

NDVI Sensor Summary

NDVI measurements have considerable value, and though there are extremes where NDVI performs poorly, even in these cases there are several solutions.  These solutions all use the near infrared and the red bands, so you can take an NDVI sensor, obtain the raw values of NIR and red reflectances and reformulate them in one of these indices (there are several other indices available that we haven’t covered). So if you’re in a system with extremely high or low LAI, try to determine how near infrared and red bands can be used in some type of vegetation index to allow you to research your specific application.

