**Importance of Seed [Fe] for Improved Agronomic Performance and Efficient Genotype Selection**

*Soybean – Genetics and Novel Techniques for Yield Enhancement*

**1. Introduction**


Plants require a continuous supply of iron (Fe) to maintain proper growth. Although Fe is the most abundant micronutrient in surface soils (Fageria et al., 2002), it is the most limiting to agricultural production throughout the world (Kochian, 2000) and to soybean production in the North Central United States (Hansen et al., 2003). Iron deficiency is a complex disorder that occurs in response to multiple soil, environmental, and genetic factors. Iron deficiency chlorosis (IDC) is symptomatic of the disorder and is commonly observed on high pH, highly calcareous soils. Planting Fe deficiency-resistant soybean [*Glycine max* (L.) Merr.] varieties has been promoted as the best strategy to alleviate or avoid Fe deficiency where soybean is grown on such soils (Fairbanks et al., 1987; Goos and Johnson, 2000; Naeve and Rehm, 2006). However, screening nurseries used to identify more resistant varieties based on visual chlorosis scores (VCS) do not always provide consistent, reliable results. A major obstacle to breeding for Fe chlorosis resistance in soybean has been that Fe deficiency symptoms and resistance scores cannot be consistently replicated among experiments, and inconsistent results preclude precise recommendations. Naeve and Rehm (2006), using nine highly tolerant and one moderately tolerant genotype, concluded that variety evaluation for IDC must be done at multiple IDC-prone locations with varying soil chemical factors. One hypothesis is that this lack of consistency is due to the complex chemical and physical conditions, in both the plant and the soil, that must be met for chlorosis to occur (Fairbanks, 2000; Naeve and Rehm, 2006). A more accurate and precise estimate of resistance to Fe deficiency may be expressed by a different plant character.

Ideally, plant traits measured to characterize resistance to Fe deficiency would be accurate, precise, simple, rapid, and inexpensive. Few plant traits or measures satisfy all of these requirements. For resistance to Fe deficiency, the "measure of choice" for decades (Weiss, 1943; Cianzio et al., 1979; Froehlich and Fehr, 1981; Fairbanks et al., 1987; Penas et al., 1990; Goos and Johnson, 2000; Helms et al., 2010) has been a subjective, discontinuous, visual estimate of the degree of chlorosis, i.e., VCS, of the most recently fully expanded middle leaflet of the third or developmentally younger trifoliolate. Cianzio et al. (1979) concluded that evaluation of foliar chlorosis, rather than measurement of chlorophyll concentration, is the most efficient procedure for comparison of cultivars because it requires relatively less labor. However, visual estimates of chlorosis made when only the first trifoliolate leaf is fully developed (Cianzio et al., 1979) may be more a reflection of planting seed [Fe] (seed Fe concentration) than of resistance to Fe deficiency (Ambler and Brown, 1974; Tiffin and Chaney, 1973; Chaney et al., 1992). Furthermore, Naeve and Rehm (2006) concluded that varietal screening based on VCS likely requires evaluation at multiple locations to be predictive. This suggests that using VCS to identify more resistant cultivars may not be the most efficient or least expensive procedure. It has been suggested that the plant character (plant height, seed number, grain yield, seed [Fe], VCS, relative chlorophyll [SPAD] reading) used to measure Fe deficiency is of primary importance in the classification of genotypes for resistance to Fe deficiency (Wiersma, 2007). Many of the characters mentioned are known to vary markedly in screening nurseries as well as in management studies (Helms et al., 2010; Naeve and Rehm, 2006; Wiersma, 2005, 2007, and 2010).

In measuring the indirect effects of recurrent selection for Fe efficiency in soybean, Beeghly and Fehr (1989) reported that Fe efficiency was not closely associated with grain yield, time of maturity, plant height, seed protein or oil, leaflet traits, or most micronutrients, with the exception of seed [Fe]. Seed weight declined 12% and seed [Fe] increased 13%, whereas seed Fe content did not change over seven cycles of selection (Beeghly and Fehr, 1989). For soils known to have yield-limiting availabilities of specific micronutrients, increasing the concentration of that micronutrient in seed used for planting has reduced Mo deficiency in corn (*Zea mays* L.) (Weir and Hudson, 1966), Zn deficiency in several species (Rashid and Fox, 1992), Fe and Zn deficiency in rice (*Oryza sativa* L.) (Gregorio et al., 2000), B deficiency in soybean (Rerkasem et al., 1997), and Fe deficiency in dry bean (*Phaseolus vulgaris* L.) (Beebe et al., 2000) and wheat (*Triticum aestivum* L.) (Shen et al., 2002). Since seed [Fe] can be regarded as an integrated measure of resistance to Fe deficiency that is manifest at maturity, perhaps seed [Fe] should be considered the "measure of choice" in determining susceptibility or resistance to IDC (Bouis et al., 2003; Nestle et al., 2006).

This chapter presents evidence that supports the use of seed [Fe] as an accurate and consistent measure of genotypic differences in Fe efficiency and agronomic performance. This 'evidence' has been garnered from recent soybean Fe deficiency trials conducted on high pH, highly calcareous soils in the North Central region of the USA (Wiersma, 2005, 2007, and 2010), from variety evaluation trials of the University of Minnesota Soybean Plant Breeding and Genetics Project, from IDC nurseries managed by R.J. Goos (http://www.soilsci.ndsu.nodak.edu/yellowsoybeans/), and from varietal trials conducted on partially limed and fully limed, acid soils in Brazil (Spehar, 1994).

Class variances were calculated and tested for homogeneity (Snedecor and Cochran, 1980; Gomez and Gomez, 1984), and when class variances were homogeneous, regression equations were developed using class means of both independent and dependent variables. Management studies involving increasing rates of seeding, increasing rates of Fe-EDDHA application, and increasing rates of N application were conducted using resistant, moderately resistant, and susceptible cultivars, without first categorizing the smaller number of genotypes into classes.

**2.1 Increasing seeding rates with low Fe-EDDHA rates**

It is generally reported that increasing seeding rates will reduce visual chlorosis ratings (early- and/or mid-season) and often will increase grain yield when soybean is grown where Fe deficiency is moderate to severe (Uvalle-Bueno and Romero, 1988; Penas et al., 1990; Goos and Johnson, 2001; Lingenfelser et al., 2005; Wiersma, 2007). Increasing seeding density (seeds unit⁻¹ of row), and, presumably, increasing the volume of soil occupied by roots unit⁻¹ of row, can lead to higher yields and higher seed [Fe], but may have little influence on early-season VCS (Fig. 1 A, C, E). When averaged across 3 years, 4 replications, 3 cultivars, and 5 rates of Fe-EDDHA, increasing seeding density almost 3-fold reduced visual chlorosis about 12% (Fig. 1 A). On the other hand, increasing Fe-EDDHA rates (in accordance with the severity of IDC) will markedly reduce early-season VCS, but may have little influence on grain yield (Fig. 1 B, D, F). Averaged across 3 years, 4 replications, 3 cultivars, and 5 seeding densities, increasing the Fe-EDDHA rate 4-fold reduced early-season visual chlorosis about 70% (Fig. 1 B). Fe acquisition, measured as seed [Fe], appears to be regulated primarily by genotype, yet Fe acquisition by less Fe-efficient cultivars can be increased by increasing seeding density or by reducing the severity of Fe deficiency. It is possible to slightly increase seed [Fe] of both susceptible and resistant cultivars grown under severe chlorosis if high rates (>4.48 kg ha⁻¹) of Fe-EDDHA are used (Table 1; Fig. 1 F). Rates of Fe-EDDHA used in these studies (Fig. 1) were much lower (1.12 to 4.48 kg ha⁻¹) than those evaluated in other studies (2.24 to 11.2 kg ha⁻¹) and may have been responsible for the moderate responses.

**2.2 High Fe-EDDHA rates**

Research to reduce or alleviate IDC in soybean by applying various seed, soil, or foliar Fe chelates or fertilizers has been conducted for decades. Although the results have been mixed (Mortvedt, 1986), and are seldom directly comparable, positive responses to foliar (Randall, 1981), seed (Karkosh et al., 1988), and soil (Penas et al., 1990; Wiersma, 2005) application have been reported. Other researchers have observed only small, if any, response to similar treatments (Goos and Johnson, 2000; Goos and Johnson, 2001; Heitholt et al., 2003). Lack of consistent results may be related to differing levels of chlorosis severity among experiments; soil, environmental, or genetic differences; and/or the low rates of Fe often applied to ensure economic feasibility. Low rates of Fe probably do not satisfy the requirement of a continuous supply of Fe as plant development progresses (Goos and Johnson, 2001).

Responses to higher (beyond economic feasibility) rates of Fe-EDDHA appear variety-specific and occur over an extended period, becoming manifest at maturity (Fig. 2). As plant development progresses, there are earlier, limited responses to low rates of Fe-EDDHA, whereas higher rates appear to provide Fe continuously and to promote later, larger responses to increasing rates of Fe-EDDHA.
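The statistical workflow described for the management studies (testing class variances for homogeneity, then developing regression equations from class means) can be sketched as follows. This is a minimal, illustrative reconstruction, not the authors' actual analysis: the synthetic VCS data, resistance-class labels, response slopes, and the 0.05 significance threshold are all assumptions made for demonstration.

```python
# Sketch of the class-variance homogeneity test (Bartlett's test) followed
# by regression of class-mean VCS on Fe-EDDHA rate. All data are invented.
import numpy as np

rng = np.random.default_rng(42)

rates = np.array([0.0, 1.12, 2.24, 3.36, 4.48])  # Fe-EDDHA rates, kg/ha
# Hypothetical visual chlorosis scores: 4 replications x 5 rates per class
classes = {
    "resistant":   rng.normal(2.0 - 0.2 * rates, 0.3, size=(4, 5)),
    "moderate":    rng.normal(3.0 - 0.3 * rates, 0.3, size=(4, 5)),
    "susceptible": rng.normal(4.5 - 0.5 * rates, 0.3, size=(4, 5)),
}

# 1. Bartlett's test statistic for homogeneity of class variances.
groups = [v.ravel() for v in classes.values()]
k = len(groups)
n = np.array([g.size for g in groups])
s2 = np.array([g.var(ddof=1) for g in groups])
N = n.sum()
sp2 = np.sum((n - 1) * s2) / (N - k)  # pooled variance
C = 1 + (np.sum(1.0 / (n - 1)) - 1.0 / (N - k)) / (3 * (k - 1))
chi2 = ((N - k) * np.log(sp2) - np.sum((n - 1) * np.log(s2))) / C
homogeneous = chi2 < 5.991  # chi-square critical value, df = k-1 = 2, alpha = 0.05

# 2. When variances are homogeneous, the chapter proceeds to regress
#    class-mean VCS (dependent) on Fe-EDDHA rate (independent).
mean_vcs = np.mean([v.mean(axis=0) for v in classes.values()], axis=0)
slope, intercept = np.polyfit(rates, mean_vcs, 1)
print(f"variances homogeneous: {homogeneous}")
print(f"VCS ~ {intercept:.2f} {slope:+.2f} * rate")
```

With real field data, a library implementation such as `scipy.stats.bartlett` would normally replace the hand-computed statistic; it is written out here only to keep the sketch self-contained.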
