SEARC Agricultural Research 2017

This report is brought to you for free and open access by New Prairie Press. It has been accepted for inclusion in Kansas Agricultural Experiment Station Research Reports by an authorized administrator of New Prairie Press. Copyright 2017 the Author(s).


Introduction
Distillers grains, a by-product of the ethanol industry, have tremendous potential as an economical and nutritious supplement for grazing cattle. Distillers grains contain a high concentration of protein (25 to 30%), with more than two-thirds escaping degradation in the rumen, which makes them an excellent supplement for younger cattle. Recent advancements in the ethanol manufacturing process have resulted in extraction of a greater amount of fat, creating distillers grains that may contain less energy than corn. This research was conducted to compare performance of stocker cattle supplemented with corn or dried distillers grains (DDG) at 0.5% body weight per head daily while grazing smooth bromegrass pastures.

Experimental Procedures
Thirty heifer calves were weighed on two consecutive days, stratified by weight, and randomly allotted to six 5-acre smooth bromegrass pastures on April 8, 2014 (423 lb), April 7, 2015 (438 lb), and April 6, 2016 (408 lb). Three pastures of heifers were randomly assigned to one of two supplementation treatments (three replicates per treatment) and grazed for 142, 182, and 197 days in 2014, 2015, and 2016, respectively. Supplementation treatments were ground corn or DDG at 0.5% body weight per head daily. DDG used in this study contained 25% protein and 6% fat. Corn was estimated to contain 10% protein and a similar level of energy as DDG. Pastures were fertilized with 100 lb/a nitrogen and P2O5 and K2O as required by soil test on February 21, 2014; March 11, 2015; and February 17, 2016. Pastures were stocked with 1 heifer/a and grazed continuously until August 28, 2014; October 6, 2015; and October 20, 2016, when heifers were weighed on two consecutive days and grazing was terminated.
Cattle in each pasture were group-fed corn or DDG in meal form in bunks on a daily basis, and pasture was the experimental unit. No implants or feed additives were used. Weight gain was the primary measurement. Cattle were weighed every 28 days; quantity of supplement fed was adjusted at that time. Cattle were treated for internal and external parasites before being turned out to pasture and later vaccinated for protection from pinkeye. Heifers had free access to commercial mineral blocks that contained 12% calcium, 12% phosphorus, and 12% salt.
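The feeding rule above (0.5% of body weight per head daily, recalculated at each 28-day weighing) reduces to simple arithmetic. A minimal sketch, with illustrative weights and head counts rather than the study's records:

```python
# Hypothetical sketch of the supplement-allocation arithmetic: each pasture
# group receives 0.5% of average body weight per head daily, recalculated
# after each 28-day weigh period. Numbers are illustrative only.

def daily_supplement_lb(avg_body_wt_lb, head, rate=0.005):
    """Pounds of supplement to deliver to a pasture group each day."""
    return avg_body_wt_lb * rate * head

# e.g. five 423-lb heifers per pasture at turnout:
print(round(daily_supplement_lb(423, 5), 3))  # 10.575 lb/pasture/day (~2.1 lb/head)
```

As cattle gain weight across weigh periods, the same function returns a larger daily allotment, which matches the description of adjusting the quantity fed every 28 days.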

Results and Discussion
Cattle gains and supplement intake are presented in Tables 1, 2, and 3 for 2014, 2015, and 2016, respectively. Grazing gain and supplement intake were 2.00 and 2.8 lb/head daily (corn) and 2.10 and 2.9 lb/head daily (DDG) in 2014; 1.69 and 3.0 lb/head daily (corn) and 1.61 and 3.0 lb/head daily (DDG) in 2015; and 1.65 and 2.8 lb/head daily (corn) and 1.64 and 2.9 lb/head daily (DDG) in 2016. Gains and supplement intake of heifers supplemented with corn were similar (P > 0.05) to those of heifers supplemented with DDG. This suggests that protein was not limiting performance of heifers grazing these pastures, as heifers fed corn received a similar amount of supplemental energy but less supplemental protein than those fed DDG.

Introduction
Supplementation of grazing cattle is most economically feasible when cattle prices are high relative to the price of grain. Energy supplementation of grazing ruminants may reduce forage intake and digestibility, but energy supplementation at low levels (less than 0.4% of body weight) has been shown to have little effect on forage intake when crude protein is not limiting. Several studies have evaluated the effect of supplementation on stocker cattle gains and forage utilization during the grazing phase, but few have evaluated the effects of supplementation during the grazing phase on subsequent finishing performance and carcass traits. This research seeks a more thorough understanding of the interactions among grazing nutrition and management, finishing performance, and carcass traits to facilitate greater economic utilization of these relationships.

Experimental Procedures
Steers (108) of predominantly Angus breeding were weighed on two consecutive days, stratified by weight, and randomly allotted to nine 5-acre smooth bromegrass pastures on April 9, 2014 (446 lb); April 7, 2015 (488 lb); and April 6, 2016 (444 lb). Three pastures of steers were randomly assigned to one of three supplementation treatments (no supplement, a starch-based supplement, or a fat-based supplement; three replicates per treatment). Cattle in each pasture were group-fed supplement in meal form on a daily basis in metal feed bunks, and pasture was the experimental unit. No implants or feed additives were used during the grazing phase. Weight gain was the primary measurement. Cattle were weighed every 28 days. Cattle were treated for internal and external parasites before being turned out to pasture and later were vaccinated for protection from pinkeye. Cattle had free access to commercial mineral blocks that contained 12% calcium, 12% phosphorus, and 12% salt. Forage availability was measured approximately every 28 days with a disk meter calibrated for smooth bromegrass.
After the grazing period, cattle were shipped to a finishing facility, implanted with Synovex S, and fed a diet of 80% whole-shelled corn, 15% corn silage, and 5% supplement (dry matter basis) for 125 and 97 days in 2014 and 2015, respectively. All cattle were slaughtered in a commercial facility at the end of the finishing period, and carcass data were collected. Cattle that grazed these pastures in 2016 were being finished for slaughter at the time that this report was written.

Results and Discussion
Average available forage for the smooth bromegrass pastures during the grazing phase, and grazing and subsequent finishing performance of grazing steers are presented by supplementation treatment for 2014 and 2015 in Tables 1 and 2, respectively. Grazing performance only is presented for 2016 in Table 3. Supplementation treatment had no effect (P > 0.05) on the quantity of forage available for grazing in any year. Pastures grazed by supplemented steers might be expected to have greater available forage DM as consumption of supplement by steers grazing these pastures would likely reduce forage intake thereby resulting in more residual forage. However, the levels of supplement fed in this study were likely small enough that they did not affect forage consumption.
Supplemented steers had greater (P < 0.05) weight gain, daily gain, and steer gain/a than those that received no supplement in all three years. In 2014 and 2016, grazing weight gain, daily gain, and gain/a were not different (P > 0.05) between steers that were supplemented with the starch-based or fat-based supplement. In 2014, steers fed the starch-based supplement had greater (P < 0.05) final finishing liveweight, greater (P < 0.05) hot carcass weight, greater (P < 0.05) overall (grazing + finishing) gain, and greater (P < 0.05) overall daily gain than those that received no supplement. Supplementation during the grazing phase had no effect (P > 0.05) on finishing weight gain, feed intake, feed:gain, backfat, ribeye area, yield grade, or marbling score.
In 2015, steers supplemented with the fat-based supplement had greater (P < 0.05) grazing gains than those that received the starch-based supplement. Steers supplemented with the fat-based supplement had higher (P < 0.05) slaughter weight, higher hot (P < 0.05) carcass weight, and lower (P < 0.05) finishing gain than those fed no supplement or supplemented with the starch-based supplement.
Under the conditions of this study, supplementation of stocker cattle grazing smooth bromegrass pasture improved grazing performance and increased slaughter weight and carcass weight. Most of the increase in slaughter weight and carcass weight can be attributed to greater gains of supplemented cattle during the grazing phase. Supplemental energy source while grazing had no effect on carcass quality.

Effects of Interseeding Ladino Clover into Tall Fescue Pastures of Varying Endophyte Status on Grazing Performance of Stocker Steers

Introduction
Tall fescue, the most widely adapted cool-season perennial grass in the United States, is grown on approximately 66 million acres. Although tall fescue is well adapted in the eastern half of the country between the temperate north and mild south, presence of a fungal endophyte results in poor performance of grazing livestock, especially during the summer. Until recently, producers with high-endophyte tall fescue pastures had two primary options for improving grazing livestock performance. One option was to destroy existing stands and replace them with endophyte-free fescue or other forages. Although it supports greater animal performance than endophyte-infected fescue, endophyte-free fescue has been shown to be less persistent under grazing pressure and more susceptible to stand loss from drought stress. In locations where high-endophyte tall fescue must be grown, the other option was for producers to adopt management strategies that reduce the negative effects of the endophyte on grazing animals, such as diluting the effects of the endophyte by incorporating legumes into existing pastures or providing supplemental feed. In recent years, new tall fescue cultivars have been developed with a non-toxic endophyte that provides vigor to the fescue plant without negatively affecting performance of grazing livestock. Interseeding legumes into these tall fescue cultivars should be an effective way of increasing gains of cattle grazing tall fescue. However, these cultivars lack the competitiveness of high-endophyte 'Kentucky 31', and their competitiveness with legumes could be a potential problem. Objectives of this study were to evaluate forage availability, stand persistence, and performance of stocker steers grazing tall fescue cultivars with non-toxic endophyte and high- and low-endophyte 'Kentucky 31' with and without ladino clover.

Experimental Procedures
Pasture was the experimental unit and weight gain was the primary measurement. No implants or feed additives were used.
Cattle were weighed and forage availability was measured every 28 days with a disk meter calibrated for tall fescue. Cattle were treated for internal and external parasites before being turned out to pasture and later vaccinated for protection from pinkeye. Steers had free access to commercial mineral blocks that contained 12% calcium, 12% phosphorus, and 12% salt. Two steers were removed from the study for reasons unrelated to experimental treatment and replaced with grazers to maintain equal stocking rates. Pastures were grazed continuously until November 29, 2016 (224 days) when steers were weighed on two consecutive days and grazing was terminated.
After the grazing period, cattle were moved to a finishing facility, implanted with Synovex-S (Zoetis, Madison, NJ), and fed a diet of 80% whole-shelled corn, 15% corn silage, and 5% supplement (dry matter basis). Cattle were being finished for slaughter to determine the effect of grazing treatment on subsequent finishing performance at the time that this report was written.

Results and Discussion
Grazing performance is pooled across legume treatment and presented by tall fescue cultivar in Table 1 and pooled across fescue cultivar and presented by legume treatment in Table 2. There were no significant interactions (P > 0.05) between fescue cultivar and legume treatment for cattle performance. However, there was a significant (P < 0.05) fescue cultivar × legume interaction for average available forage DM. Steers that grazed low-endophyte Kentucky 31, HM4, or MaxQ were heavier (P < 0.05) at the end of the grazing period, had greater (P < 0.05) grazing gain, greater (P < 0.05) daily gain, and produced greater (P < 0.05) gain/a than steers grazing high-endophyte Kentucky 31. Average available forage DM of high-endophyte Kentucky 31 pasture was greater (P < 0.05) than that of low-endophyte Kentucky 31, HM4, or MaxQ. MaxQ pasture had greater (P < 0.05) available forage DM than low-endophyte Kentucky 31. Average available forage DM of HM4 pasture was similar (P > 0.05) to that of low-endophyte Kentucky 31 and MaxQ pastures. Steer gains were similar (P > 0.05) between pastures fertilized with an additional 80 lb/a N and those interseeded with ladino clover. Pastures with clover had less (P < 0.05) available forage DM than those without clover for all cultivars except high-endophyte Kentucky 31, where available forage DM of pastures with and without clover were similar (P > 0.05).

Summary
A total of 280 mixed black yearling steers were used to compare grazing and subsequent finishing performance from pastures with 'MaxQ' tall fescue, a wheat-bermudagrass double-crop system, or a wheat-crabgrass double-crop system in 2010, 2011, 2012, 2013, 2014, 2015, and 2016. Daily gains of steers that grazed MaxQ fescue, wheat-bermudagrass, or wheat-crabgrass were similar (P > 0.05) in 2010 and 2016. Daily gains of steers that grazed wheat-bermudagrass or wheat-crabgrass were greater (P < 0.05) than those that grazed MaxQ fescue in 2011 and 2012. Daily gains of steers that grazed wheat-crabgrass were greater (P < 0.05) than those that grazed wheat-bermudagrass and similar (P > 0.05) to those that grazed MaxQ fescue in 2013. Daily gains of steers that grazed wheat-crabgrass were greater (P < 0.05) than those that grazed wheat-bermudagrass or MaxQ fescue in 2014. In 2015, daily gains of steers that grazed wheat-crabgrass were greater (P < 0.05) than those that grazed wheat-bermudagrass or MaxQ fescue.

Introduction
MaxQ tall fescue, a wheat-bermudagrass double-crop system, and a wheat-crabgrass double-crop system have been three of the most promising grazing systems evaluated at the Southeast Agricultural Research Center in the past 20 years, but these systems have never been compared directly in the same study. The objective of this study was to compare grazing and subsequent finishing performance of stocker steers that grazed these three systems.

Experimental Procedures
Each year from 2010 through 2016, after the grazing period, cattle were moved to a finishing facility, implanted with Synovex-S (Zoetis, Madison, NJ), and fed a diet of 80% whole-shelled corn, 15% corn silage, and 5% supplement (dry matter basis). Finishing diets were fed for 94 days (wheat-bermudagrass and wheat-crabgrass) or 100 days (fescue) in 2010, 98 days (wheat-bermudagrass and wheat-crabgrass) or 96 days (fescue) in 2011, 105 days in 2012, 105 days (wheat-bermudagrass and wheat-crabgrass) or 91 days (fescue) in 2013, 119 days (wheat-bermudagrass and wheat-crabgrass) or 106 days (fescue) in 2014, and 99 days (wheat-bermudagrass and wheat-crabgrass) or 97 days (fescue) in 2015. All steers were slaughtered in a commercial facility, and carcass data were collected. Cattle that grazed these pastures in 2016 were being finished for slaughter at the time that this report was written.

Results and Discussion
Grazing and subsequent finishing performance of steers that grazed MaxQ tall fescue, a wheat-bermudagrass double-crop system, or a wheat-crabgrass double-crop system are presented in Tables 1, 2, 3, 4, 5, and 6 for 2010, 2011, 2012, 2013, 2014, and 2015, respectively. Grazing performance only for 2016 is presented in Table 7. Daily gains of steers that grazed MaxQ tall fescue, wheat-bermudagrass, or wheat-crabgrass were similar (P > 0.05) in 2010, but total grazing gain and gain/a were greater (P < 0.05) for MaxQ tall fescue than wheat-bermudagrass or wheat-crabgrass because steers grazed MaxQ tall fescue for more days. Gain/a for MaxQ fescue, wheat-bermudagrass, and wheat-crabgrass were 362, 286, and 258 lb/a, respectively. MaxQ tall fescue pastures had greater (P < 0.05) average available forage dry matter (DM) than wheat-bermudagrass or wheat-crabgrass. Grazing treatment in 2010 had no effect (P > 0.05) on subsequent finishing gains. Steers that grazed MaxQ were heavier (P < 0.05) at the end of the grazing phase, maintained their weight advantage through the finishing phase, and had greater (P < 0.05) hot carcass weight than those that grazed wheat-bermudagrass or wheat-crabgrass pastures. Steers that previously grazed wheat-bermudagrass or wheat-crabgrass had lower (P < 0.05) feed:gain than those that had grazed MaxQ.
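As a rough illustration of how the gain/a figures relate to per-head gain and stocking rate, the calculation is just total liveweight gain on a pasture divided by its acreage. The numbers below are hypothetical, not the report's data:

```python
# Illustrative only: gain per acre (lb/a) is the total liveweight gain
# produced on a pasture divided by its acreage. Weights are hypothetical.

def gain_per_acre(start_wt_lb, end_wt_lb, head, acres):
    """Steer gain per acre for one pasture."""
    return (end_wt_lb - start_wt_lb) * head / acres

# e.g. a 5-acre pasture stocked with 5 steers each gaining from 446 to 808 lb:
print(gain_per_acre(446, 808, 5, 5))  # 362.0 lb/a
```

This also shows why a system grazed for more days can post greater gain/a even at a similar daily gain: more grazing days raise the ending weight and therefore total gain per acre.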
In 2011, daily gains, total gain, and gain/a of steers that grazed wheat-bermudagrass or wheat-crabgrass were greater (P < 0.05) than MaxQ fescue. Gain/a for MaxQ fescue, wheat-bermudagrass, and wheat-crabgrass were 307, 347, and 376 lb/a, respectively. MaxQ tall fescue pastures had greater (P < 0.05) average available forage DM than wheat-bermudagrass or wheat-crabgrass. This was likely due to greater forage production by MaxQ and/or greater forage intake by steers grazing wheat-bermudagrass and wheat-crabgrass. Steers that grazed MaxQ had greater (P < 0.05) finishing gain than those that grazed wheat-bermudagrass and lower (P < 0.05) feed:gain than those that grazed wheat-bermudagrass or wheat-crabgrass. Carcass weight was similar (P > 0.05) among treatments.
In 2012, daily gains, total gain, and gain/a of steers that grazed wheat-bermudagrass or wheat-crabgrass were greater (P < 0.05) than MaxQ fescue. Gain/a for MaxQ fescue, wheat-bermudagrass, and wheat-crabgrass were 226, 325, and 313 lb/a, respectively. MaxQ tall fescue pastures had greater (P < 0.05) average available forage DM than wheat-bermudagrass or wheat-crabgrass. Grazing treatment had no effect (P > 0.05) on subsequent finishing performance or carcass characteristics.
In 2013, daily gain was greater (P < 0.05) for steers that grazed wheat-crabgrass than for those that grazed wheat-bermudagrass, and daily gains from MaxQ fescue and wheat-bermudagrass were similar (P > 0.05). Gain/a for MaxQ fescue, wheat-bermudagrass, and wheat-crabgrass were 338, 244, and 316 lb/a, respectively. Gain/a was greater (P < 0.05) for MaxQ fescue and wheat-crabgrass than for wheat-bermudagrass. Overall gain was not different between forage systems; however, steers grazed MaxQ fescue for 49 more days than wheat-bermudagrass or wheat-crabgrass. Overall daily gain was greater (P < 0.05) for wheat-crabgrass than for MaxQ tall fescue. MaxQ tall fescue pastures had greater (P < 0.05) average available forage DM than wheat-bermudagrass or wheat-crabgrass, and wheat-bermudagrass pastures had more (P < 0.05) available forage DM than wheat-crabgrass. Grazing treatment had no effect (P > 0.05) on subsequent finishing daily gain or carcass characteristics.
In 2014, daily gain was greater (P < 0.05) for steers that grazed wheat-crabgrass than for those that grazed wheat-bermudagrass or MaxQ fescue, and daily gains from MaxQ fescue and wheat-bermudagrass were similar (P > 0.05). Gain/a for MaxQ fescue, wheat-bermudagrass, and wheat-crabgrass were 370, 282, and 383 lb/a, respectively. Gain/a was greater (P < 0.05) for MaxQ fescue and wheat-crabgrass than for wheat-bermudagrass. Overall gain and overall daily gain for wheat-crabgrass were greater (P < 0.05) than for wheat-bermudagrass or MaxQ fescue, while overall gain and overall daily gain for MaxQ fescue and wheat-bermudagrass were similar (P > 0.05). MaxQ tall fescue pastures had greater (P < 0.05) average available forage DM than wheat-bermudagrass or wheat-crabgrass, and wheat-bermudagrass pastures had more (P < 0.05) available forage DM than wheat-crabgrass. Grazing treatment had no effect (P > 0.05) on subsequent finishing daily gain or carcass characteristics.
In 2015, daily gain was greater (P < 0.05) for steers that grazed wheat-crabgrass than for those that grazed wheat-bermudagrass or MaxQ fescue, and daily gain from wheat-bermudagrass was greater (P < 0.05) than from MaxQ fescue. Gain/a for MaxQ fescue, wheat-bermudagrass, and wheat-crabgrass were 291, 337, and 396 lb/a, respectively. Gain/a was greater (P < 0.05) for wheat-crabgrass than for wheat-bermudagrass and MaxQ fescue and greater (P < 0.05) for wheat-bermudagrass than MaxQ fescue. Overall gain for MaxQ fescue was greater (P < 0.05) than for wheat-bermudagrass or wheat-crabgrass, while overall gain for wheat-bermudagrass and wheat-crabgrass were similar (P > 0.05). Overall daily gains were similar (P > 0.05) among forage systems. MaxQ tall fescue pastures had greater (P < 0.05) average available forage DM than wheat-bermudagrass or wheat-crabgrass, and wheat-bermudagrass pastures had more (P < 0.05) available forage DM than wheat-crabgrass. Slaughter weight, finishing gains, hot carcass weight, and ribeye area of steers that grazed MaxQ fescue were greater (P < 0.05), and feed:gain was less (P < 0.05), than those that grazed wheat-bermudagrass or wheat-crabgrass. Much of this difference in finishing performance can be attributed to muddier feedlot conditions during the time that the wheat-bermudagrass and wheat-crabgrass steers were being finished for slaughter than for the MaxQ fescue cattle.
In 2016, daily gains were similar (P > 0.05) for steers that grazed MaxQ tall fescue, a wheat-bermudagrass double-crop system, or a wheat-crabgrass double-crop system. However, MaxQ tall fescue pastures were grazed 61 days longer and as a result produced greater (P < 0.05) steer grazing gain, heavier (P < 0.05) steer ending weight, and greater (P < 0.05) gain per acre than wheat-bermudagrass or wheat-crabgrass pastures. Gain/a for MaxQ fescue, wheat-bermudagrass, and wheat-crabgrass were 368, 280, and 287 lb/a, respectively. Average available forage DM for MaxQ tall fescue was greater (P < 0.05) than for the wheat-bermudagrass double-crop system or wheat-crabgrass double-crop system and average available forage DM for the wheat-bermudagrass double-crop system was greater (P < 0.05) than for the wheat-crabgrass double-crop system.
Hotter, drier weather during the summer of 2011 and 2012 likely provided more favorable growing conditions for bermudagrass and crabgrass than for fescue, which was reflected in greater (P < 0.05) gains by cattle grazing those pastures. Lack of precipitation also reduced the length of the grazing season for MaxQ fescue pastures in 2012, which resulted in less fall grazing and lower gain/a than was observed for those pastures in 2010, 2011, 2013, 2014, 2015, and 2016.

Summary
The purpose of this study was to determine the effects of two sources of organic trace minerals and two sources of magnesium supplementation on performance of spring-calving cows grazing K-31 endophyte-infected fescue. The two treatments were organic trace minerals (zinc (Zn), copper (Cu), and manganese (Mn)) offered free choice as amino acid chelates with magnesium (Mg) as an amino acid chelate (CHEL), or an organic trace mineral supplement with amino acid complexes and magnesium supplied as magnesium oxide (COMP). Mineral was offered free choice beginning 30 days before the breeding season on 4 ranches with 6 pastures per treatment (n = 203 cows). Blood samples were collected prior to mineral supplementation and at pregnancy evaluation, and serum was analyzed for Mg, Zn, Cu, and Mn. One ranch had an anaplasmosis event; therefore, analysis was completed with and without this ranch. Pregnancy rate was not different (P = 0.46) when all 4 ranches were analyzed, even though pregnancy rates were 89.3 and 92.9% for COMP and CHEL, respectively. Cows on the COMP mineral calved 6 days earlier (P = 0.04). When the anaplasmosis ranch was removed, pregnancy rate approached a tendency toward a difference (P = 0.15), with pregnancy rates of 95.5 and 87.2% for CHEL and COMP, respectively, and a tendency (P = 0.12) for COMP cows to calve 5 days earlier. All serum mineral levels were lower at pregnancy detection than at the initial blood draw, primarily due to a reduction in mineral levels in fescue late in summer and a reduction in intake at the end of the project. Serum Mg tended (P = 0.11) to be more stable with the CHEL mineral, such that final and initial Mg were similar. Serum Zn, Cu, and Mn were not different (P > 0.10), with the exception of some ranch-to-ranch variation. Additionally, CHEL intake was 6% lower than COMP.
Even with the lower intake of the CHEL mineral, serum mineral levels were similar between both treatments; this indicates that CHEL minerals are more bioavailable. Overall, chelated minerals appear to provide an advantage to spring-calving cows on K31 fescue especially from a chelated magnesium source.

Introduction
Failure to breed is the number one culling criterion for beef cattle operations. Spring-calving operations on endophyte-infected fescue are among the most difficult management systems in which to breed cows. One issue with endophyte-infected fescue is that it can raise the body temperature of the cow, which negatively impacts breeding success. Conception issues arise from the combined effects of hot weather during breeding and the increase in body temperature associated with cattle grazing endophyte-infected K-31 fescue. Incorporating management practices to improve reproductive success should lead to increased revenue and sustainability for cattle producers.
Organic forms of trace minerals have been shown to improve reproductive success relative to inorganic forms. This study examined the ability of metal amino acid complexes versus chelates to offset some of the production issues associated with high-endophyte fescue. Therefore, the objective of this study was to determine the performance effects of supplementing cows on K-31 fescue with metal amino acid chelates (CHEL) versus a metal amino acid complex (COMP).

Experimental Procedures
This experiment was approved by the Kansas State University Institutional Animal Care and Use Committee prior to initiation of the project. Four ranches with a total of 203 spring-calving cows in southeast Kansas and southwest Missouri were used in a completely randomized block design in which cows were offered one of two organic mineral supplements beginning 30 days prior to the breeding season and ending at pregnancy exam. The two treatments were free-choice mineral in which copper (Cu), zinc (Zn), and manganese (Mn) were offered in the complex form (Availa-4, Zinpro Corp, Eden Prairie, MN; COMP) or mineral in which Cu, Zn, and Mn were offered in the chelate form (Mineralate-3ChelateBlend, Nutech Biosciences, Inc, Oneida, NY; CHEL). Additionally, magnesium source differed between the two minerals: magnesium (Mg) in COMP was supplied as magnesium oxide, while Mg in CHEL was supplied as amino acid chelated magnesium (Mineralate-Mg 10, Nutech Biosciences, Inc, Oneida, NY). Mineral supplements were balanced to provide equal amounts of all required macro and trace minerals and vitamins with the addition of chlortetracycline (CTC; 0.5 mg/hd/d) for anaplasmosis control (Table 1) and were formulated for a 4 oz/head/day intake. Pregnancy evaluation was completed in the fall of 2015; three of the four ranches' pregnancy determinations were completed by manual palpation, and one ranch used an initial screening blood pregnancy test followed by manual palpation by a veterinarian.
All mineral was offered to cows using ground mineral feeders (Dura-Bull Mineral Feeder, Pride of Farm, Houghton, IA). Mineral feeders were placed near a water source at all locations. Mineral intake was calculated for each ranch based on amount offered through the project period. Fescue samples were collected (n = 10) in each pasture in June then evaluated for endophyte presence using aniline blue staining of the epidermal strip under a microscope.
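The intake calculation described above amounts to dividing the total mineral offered by head-days. A minimal sketch, with assumed head counts, days, and amounts rather than the study's records:

```python
# Hedged sketch of the per-ranch mineral intake calculation: average
# oz/head/day from the total amount offered over the project period.
# All figures below are assumptions for illustration.

OZ_PER_LB = 16

def intake_oz_per_head_day(total_offered_lb, head, days):
    """Average free-choice mineral intake in oz/head/day."""
    return total_offered_lb * OZ_PER_LB / (head * days)

# e.g. a hypothetical ranch offering 1,500 lb to 50 cows over 120 days:
print(intake_oz_per_head_day(1500, 50, 120))  # 4.0 oz/head/day
```

Comparing this figure against the 4 oz/head/day formulation target is how over- or under-consumption of a free-choice mineral is typically judged.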
Calving dates were recorded for spring-calving cows in 2016 to determine calving distribution.

Results and Discussion
Pasture endophyte infection levels were low in 2015 for all pastures tested, ranging from 10 to 25% infection rate with very little variation among pastures within ranches. Mineral intake was higher with the COMP mineral than the CHEL mineral (3.2 vs. 3.0 oz/hd/d). Water quality was similar between pastures and did not impact overall mineral intake.

Pregnancy Evaluations
Overall pregnancy rate was not different (P = 0.46) for COMP or CHEL, with 89.3% and 92.9% pregnancy rates, respectively (Table 2). Biologically and economically, a difference of this size in pregnancy rate can be meaningful. In two of the four ranches, pregnancy rate was numerically higher for the CHEL mineral (Ranch B had a significantly higher pregnancy rate, ~23 percentage points; P < 0.05), and at one ranch the pregnancy rates were the same (100% for both minerals). At Ranch D, there was an anaplasmosis event in the cows receiving the CHEL mineral, and subsequently that herd was the only one where pregnancy rate was higher in the COMP treatment (95 vs. 84% for COMP vs. CHEL). There was no significant ranch effect (P = 0.81) or ranch × treatment interaction (P = 0.14). Interestingly, cows on the COMP mineral calved 6 days earlier than cows receiving the CHEL mineral (P = 0.04) when evaluating all four ranches (Table 3). The lack of significance in pregnancy rate may reflect the limited number of cows in the study.
When the anaplasmosis ranch was removed, the treatment difference was still not statistically significant but moved closer to significance (P = 0.15), with an even larger difference in pregnancy rates: 95.5% versus 87.2% for CHEL and COMP, respectively (Table 2). When Ranch D was removed from the analysis because of anaplasmosis, there was no difference (P = 0.12) in calving distribution, even though numerically the cows on COMP mineral calved 5 days earlier (Table 3).

Blood Mineral
Serum mineral levels were lower in the second collection period for all minerals measured, which corresponded to lower mineral intakes later in the season and a reduction in forage Mg concentrations. Typically, when lush growing fescue is tested for Mg, the value indicated should meet cow requirements; however, it has been reported that only ~30% of that Mg can be utilized. The recommendation for gestating cows is 7 to 9 g/d of Mg to maintain a blood level of 20 ppm, the ideal serum concentration. Serum Mg tended to be more stable with the CHEL mineral (P = 0.11) than the COMP mineral, as evidenced by the difference between final and initial Mg levels: -0.21 ppm for CHEL and -2.60 ppm for cows on the COMP mineral. CHEL thus allowed cows to maintain serum Mg at pregnancy check similar to initial levels despite the decrease of Mg in forage, which may help explain the numerically greater conception rates. Additionally, CHEL cows consumed less mineral while maintaining serum Mg concentration, suggesting that CHEL was more bioavailable (Figure 1).
At the initiation of the study, cows on the COMP treatment group tended to have higher serum Mg concentrations (P = 0.14). This might be the explanation for why COMP cows calved earlier in the calving season than cows on CHEL. Additionally, final serum Mg was impacted by ranch and mineral supplementation (P = 0.03; Figure 2) where Ranch B COMP cows had the lowest final Mg.
Serum Zn, Cu, and Mn were not different (P > 0.10) for any treatments (Figure 1). Serum Mn did not differ by treatment, ranch, or their interaction for initial, final, or change values (P > 0.10), with all levels ~0.04 ppm (~40 ng/mL), which is nutritionally adequate. There was a tendency for a treatment by ranch interaction (P = 0.09) for final Zn, where Ranches A and B had lower Zn than Ranches C and D (Figure 3). Ranches C and D had adequate (> 0.80 ppm) levels of Zn, while Ranches A and B were considered marginal (between 0.50 and 0.79 ppm). There was a treatment by ranch interaction (P = 0.02) for final Cu concentration, where Ranch C cows on both minerals had greater Cu levels than both minerals for Ranch A and the CHEL mineral for Ranches B and D (Figure 4).
At the initiation of the study, average Cu levels would be considered marginal, while Zn and Mn were adequate. Final Cu levels remained marginal for Ranches A, B, and D, while Ranch C had an adequate level according to recommendations. All ranches had adequate final Mn concentrations. Ranch C had the highest levels of Zn and Cu, adequate for both minerals at the end of the study, which might explain the 100% pregnancy rate for cows in all treatments at this ranch. Ranch B had the lowest pregnancy rate for cows on the COMP mineral. Zn levels were considered marginal for Ranch B COMP cows, which, in combination with the lowest Mg levels, might explain the reduction in pregnancy rate.

Implications
The amino acid chelated organic trace mineral supplement showed promise as an aid to reproductive success for spring-calving producers on K31 endophyte-infected fescue. Although the difference was not statistically significant, pregnancy rate was numerically, and thus economically, greater for cows on the chelated trace minerals. Specifically, the amino acid chelated Mg has the potential to be an improved source of Mg in mineral supplements. Magnesium plays a significant role in reproductive success, as evidenced by this study. Cattle with higher circulating concentrations of Mg breed earlier; however, a stable concentration appears to improve herd-level pregnancy rates, which was observed with the amino acid chelated mineral. Additionally, this study demonstrated that the chelated form of trace minerals was more bioavailable: intake was lower while serum levels were maintained equal to or greater than those of the complex form, which had greater consumption.

Summary
Use of legumes in bermudagrass pastures did not affect summer cow gains in 2016.
Forage availability was also similar between the Legume system, which included ladino clover, and the system fertilized with nitrogen (N) alone. Estimated forage crude protein (CP) was greater for the Legume than the Nitrogen system in early summer but was similar by mid-summer.

Introduction
Bermudagrass is a productive forage species when intensively managed. However, it has periods of dormancy and requires proper management to maintain forage quality. Legumes in the bermudagrass sward could improve forage quality and reduce fertilizer usage; however, legumes are difficult to establish and maintain with the competitive grass. Clovers can survive once established in bermudagrass sod and may be productive enough to substitute for some N fertilization. This study was designed to compare dry cow performance on a bermudagrass pasture system that included ladino and crimson clovers (Legume) vs. bermudagrass alone (Nitrogen).

Experimental Procedures
Eight 5

Results and Discussion
Average available forage is plotted by date (Figure 1) because there was no difference (P > 0.05) between the Nitrogen and Legume treatments. Estimated crude protein concentration was greater (P < 0.05) for the Nitrogen than the Legume system at the first sampling but was higher for the Legume treatment in early summer (Figure 1), likely because legumes contain more protein. By midsummer, estimated CP was similar for the two treatments, partially because of the effects of N fertilizer and perhaps reduced legume content later in the season.
Cow performance data are presented in Table 1. Gains during the 2016 season were similar (P > 0.05) for the Legume and Nitrogen systems; means within a row were not significantly different at P = 0.05.

Introduction
Tall fescue (Lolium arundinaceum Schreb.) is the most widely grown forage grass in southeastern Kansas. Its tolerance to the extremes of climate and soils in the region is partly attributable to its association with a fungal endophyte, Neotyphodium coenophialum; however, the ubiquitous endophyte is also responsible for production of substances toxic to some herbivores, including cattle, sheep, and horses. Endophytes that purportedly lack toxins but augment plant vigor have been identified and inserted into tall fescue cultivars adapted to the United States. These cultivars and others that are fungus-free or contain a ubiquitous endophyte (i.e., Ky 31 EF and HE, respectively) are included in this test.

Experimental Procedures
The trial was seeded at the Mound Valley Unit of the Southeast Agricultural Research Center in 10-in. rows on Parsons silt loam soil. Plots were 35 × 5 ft and were arranged in four randomized complete blocks. They were fertilized preplant with 20-50-60 lb/a of N-P₂O₅-K₂O and seeded with 20 lb/a of pure, live seed on September 30, 2014. Spring fertilizer (120-50-75 lb/a of N-P₂O₅-K₂O) was applied on February 1, and fall growth was supplemented with 60 lb/a N on August 23, 2015.
Harvest was performed on a 3-ft strip, 16 to 20 ft long, from each plot. A flail-type harvester was used to cut to a 3-in. height on May 9, 2016. After harvest, forage was removed from the rest of each plot at the same height. A forage subsample was collected from each plot and dried at 140°F for moisture determination. Summer regrowth was similarly harvested on August 18, and fall growth was harvested on December 6, 2016.
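Moisture determination by oven drying reduces to a simple mass ratio. A minimal sketch of the arithmetic (the function names and sample masses are hypothetical, not the research center's actual protocol):

```python
def dry_matter_fraction(wet_weight_g, dry_weight_g):
    """Fraction of the fresh sample remaining as dry matter after oven drying."""
    return dry_weight_g / wet_weight_g

def moisture_pct(wet_weight_g, dry_weight_g):
    """Moisture content of the fresh forage sample, as a percentage."""
    return 100.0 * (1.0 - dry_matter_fraction(wet_weight_g, dry_weight_g))

# Hypothetical example: a 500 g subsample weighing 150 g after drying at 140°F
print(moisture_pct(500.0, 150.0))  # 70.0 (% moisture)
```

Plot yields can then be reported on a dry matter basis by multiplying fresh harvest weight by the dry matter fraction of the subsample.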

Results and Discussion
Spring 2016 yields ranged from 2.82 tons/a (12% moisture basis) for BarOptima PLUS E34 to 4.77 tons/a for NFTF 1051 (Table 1). The latter yielded more (P < 0.05) than 12 of the 19 other entries, and six entries yielded more than the four lowest-yielding entries.
Summer forage production averaged 2.35 tons/a (Table 1). This was more than usual because precipitation at Mound Valley during July 2016 was 60% above the 30-year average. PBU-B2 yield was greater than that of 'PBU-B1,' 'Bar FAF 131,' and 'Martin 2 ProTek,' the latter yielding less than eight entries.
Spring forage dry matter content (Table 1) may be somewhat related to maturity. In that case, PBU-B2, 'PBU-B5,' and PBU-B7 may have been the most mature at the first cutting. Forage dry matter content on December 6 was lower in 'LE 14-84,' 'LE 14-86,' 'Tower Pro Tek,' and BarOptima PLUS E34 than in the other 16 entries. Low dry matter content in late fall may indicate frost tolerance of forage, since temperatures below freezing began to occur 23 days before harvest.

Introduction
Miscanthus is a productive, efficient genus of warm-season perennial grass. Because of its growth potential and stalk properties, miscanthus has been identified by the U.S. Department of Energy as a possible dedicated energy crop. This study was established to compare cultivars for adaptation in eastern Kansas and to produce biomass to test for suitability as a bioenergy crop.

Experimental Procedures
Two cultivars were planted on 3-ft spacings on May 24, 2012 in four replications at the Mound Valley Unit of the Southeast Agricultural Research Center. The initial soil test indicated 18 and 280 lb/a of available phosphorus (P) and potassium (K), respectively, with 2.0% organic matter and pH 6.2 in a silty clay loam.
Plots were 3 rows, with seven plants per row. Plants were irrigated occasionally in the summer of 2012, but several were replanted in late May through early June 2013. Cultivation was performed for weed control in the summer of 2012 and once in 2013, but no further cultural practices have been performed. The center row of each plot was harvested at a 2.5-in. height after each growing season; the 2016 harvest was conducted on December 1. At harvest, biomass was subsampled, dried at 140°F for moisture content, and saved for analysis.

Results and Discussion
Each year, dry matter (DM) production was similar for the cultivars (P > 0.10, Table 1). In 2013, average yield was less than 5,000 lb/a, because only 1.40 in. of rainfall was received between June 5 and July 20, and stands were not fully established. In 2014, 2015, and 2016, DM production did not differ between cultivars or years, averaging 10,970; 10,250; and 8,890 lb/a, respectively. The four-year production thus totaled 35,050 lb DM/a, for an average yield of 8,760 lb/a/yr.
Biomass had similar dry matter content for the two cultivars each year and for the average across years (P > 0.10, Table 2). However, dry matter content was highest in 2014 and lowest in 2016. This variation probably was affected more by preharvest weather conditions than by maturity differences, since harvest dates were weeks after the first killing freeze. No difference (P > 0.10) was found between cultivars within or across years.
2 Means within a year followed by a different letter were significantly different at P = 0.05.

Summary
Tall fescue production was studied during a third year at two locations. Third-year production was affected by an interaction between nitrogen (N) and phosphorus (P) fertilization rates at Site 1 in 2015, but mainly by N fertilization rates at Site 2 in 2016, with little effect from potassium (K) fertilization at either site.

Introduction
Tall fescue is the major cool-season grass in southeastern Kansas. Perennial grass crops, as with annual row crops, rely on proper fertilization for optimum production; however, meadows and pastures are often under-fertilized and produce low quantities of low-quality forage. This is often true even when new stands are established. The objective of this study was to determine whether nitrogen (N), phosphorus (P), and potassium (K) fertilization improves yields during the early years of a stand.

Experimental Procedures
The experiment was established on two adjacent sites in fall 2012 (Site 1) and fall 2013 (Site 2) at the Parsons Unit of the Kansas State University Southeast Agricultural Research Center. The soil at both sites was a Parsons silt loam with initial soil test values of 5.9 pH, 2.8% organic matter, 4.2 ppm P, 70 ppm K, 3.9 ppm NH₄-N, and 37.9 ppm NO₃-N in the top 6 inches at Site 1; and 6.5 pH, 2.2% organic matter, 6.7 ppm P, 58 ppm K, 6.8 ppm NH₄-N, and 12.3 ppm NO₃-N in the top 6 inches at Site 2. The experimental design was a split-plot arrangement of a randomized complete block. The six whole plots were combinations of P₂O₅ and K₂O fertilizer levels allowing for two separate analyses: 1) four levels of P₂O₅ consisting of 0, 25, and 50 lb/a applied each year, and a fourth treatment of 100 lb/a applied only at the beginning of the study; and 2) a 2 × 2 factorial combination of two levels of P₂O₅ (0, 50 lb/a) and two levels of K₂O (0, 40 lb/a). Subplots were four levels of N fertilization: 0, 50, 100, and 150 lb/a. Phosphorus and K fertilizers were broadcast applied in the fall as 0-46-0 (triple superphosphate) and 0-0-60 (potassium chloride). Nitrogen was broadcast applied in late winter as solid 46-0-0 (urea). Third-year samplings and harvests from each site were as follows: early growth yield, as an estimate of grazing potential in early spring, was sampled at the E2 (jointing) growth stage; hay yield was harvested at the R4 stage; and regrowth was harvested in the fall.
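Fertilizer P and K rates, as above, are conventionally reported on an oxide basis (P₂O₅, K₂O); converting to elemental rates uses fixed molecular-mass fractions (~43.6% P in P₂O₅, ~83.0% K in K₂O). A quick sketch of that conversion (the function names are hypothetical, added only for illustration):

```python
# Standard molecular-mass ratios: P2O5 is ~43.6% P; K2O is ~83.0% K.
P_FRACTION = 0.436
K_FRACTION = 0.830

def elemental_p(p2o5_lb_per_acre):
    """Convert a P2O5 rate (lb/a) to elemental P (lb/a)."""
    return p2o5_lb_per_acre * P_FRACTION

def elemental_k(k2o_lb_per_acre):
    """Convert a K2O rate (lb/a) to elemental K (lb/a)."""
    return k2o_lb_per_acre * K_FRACTION

# Example: the 50 lb/a P2O5 and 40 lb/a K2O whole-plot rates above
print(round(elemental_p(50.0), 1))  # 21.8 lb P/a
print(round(elemental_k(40.0), 1))  # 33.2 lb K/a
```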

Results and Discussion
Third-year production of tall fescue (Site 1 in 2015 and Site 2 in 2016) was affected by an interaction between N and P fertilization at Site 1, but predominantly by N fertilization at Site 2, with little response to K at either site. At Site 1 in 2015, early yield at the E2 (jointing) growth stage, an estimate of the forage available if grazed early, was increased by 50 lb N/a without P fertilization, but higher N rates did not further increase E2 yield (Table 1). With P fertilization, however, early yield at E2 increased with N rates up to 150 lb/a. At the R4 hay harvest in 2015, yield was increased by N additions up to 100 lb/a with no P, but with 25 lb P₂O₅/a, yield increased to more than 3 ton/a with 150 lb N/a. Fall harvest yield was increased by N rates up to 150 lb/a with no P. However, fall yields obtained with higher N rates and P fertilization were lower than those with no P at high N rates, and the response to N was smaller, potentially because the lower R4 yields without P fertilization left residual unused N. Total yield ranged up to nearly 4 ton/a with P fertilization and higher N rates.
For the third year of production at Site 2 (2016), yield was affected mainly by N rate. Yields at the E2 sampling, R4 hay harvest, and fall harvest were not affected by P fertilization, and the response to K fertilization was marginal. Increasing N rates tended to increase yield at the E2 sampling and R4 hay harvest, but the response was less defined at the fall harvest (Table 2). Total yield averaged less than 3 ton/a, even at the 150 lb/a N rate.

Summary
In 2016, adding nitrogen (N) greatly improved average wheat yields, but the response to tillage and different N placement methods was minimal. Double-crop soybean yields were unaffected by tillage or the residual from N treatments that were applied to the previous wheat crop.

Introduction
Many crop rotation systems are used in southeastern Kansas. This experiment is designed to determine the long-term effect of selected tillage and N fertilizer placement options on yields of short-season corn, wheat, and double-crop soybean in rotation.

Experimental Procedures
A split-plot design with four replications was initiated in 1983 with tillage system as the whole plot and N treatment as the subplot. In 2005, the rotation was changed to begin a short-season corn/wheat/double-crop soybean sequence. Use of three tillage systems (conventional, reduced, and no-till) continues in the same areas as the previous 22 years. The conventional system consists of chiseling, disking, and field cultivation. Chiseling occurs in the fall preceding corn or wheat crops. The reduced-tillage system consists of disking and field cultivation prior to planting. Glyphosate is applied to the no-till areas prior to planting. The four N treatments for the crop are: no-N (control), broadcast urea-ammonium nitrate (UAN; 28% N) solution, dribble UAN solution, and knife UAN solution at 4 inches deep. The N rate for the corn crop grown in odd-numbered years is 125 lb/a. The N rate of 120 lb/a for wheat is split, with 60 lb/a applied preplant as broadcast, dribble, or knifed UAN; all plots except the no-N controls are top-dressed in the spring with broadcast UAN at 60 lb/a N.

Results and Discussion
In 2016, conventional tillage resulted in a 2 bu/a greater yield than no-till (Table 1). Overall, fertilizing with N quadrupled wheat yield, but preplant application method (broadcast, dribble, or knife) did not affect yields. Average yield of soybean planted double-crop after wheat harvest was nearly 40 bu/a in 2016 but was not affected by tillage system or the residual from N fertilizer treatments applied to the wheat.

Timing of Side-Dress Applications of Nitrogen for Corn in Conventional and No-Till Systems

Introduction
Environmental conditions vary widely in the spring in southeastern Kansas. As a result, much of the N applied prior to corn planting may be lost before the time of maximum plant N uptake. Side-dress or split applications to provide N during rapid growth periods may improve N use efficiency while reducing potential losses to the environment. The objective of this study was to determine the effect of timing of side-dress N fertilization compared with preplant N applications for corn grown on a claypan soil.

Experimental Procedures
The experiment was established in spring 2015 on a Parsons silt loam soil at the Parsons Unit of the Kansas State University Southeast Agricultural Research Center. The experiment was a split-plot arrangement of a randomized complete block design with four blocks (replications). Whole-plot tillage treatments were conventional tillage (chisel, disk, and field cultivate) and no-till. Subplot nitrogen treatments were six preplant/side-dress N application combinations: 1) a no-N control; 2) 150 lb N/a applied preplant; 3) 100 lb N/a applied preplant with 50 lb N/a applied at the V6 (six-leaf) growth stage; 4) 100 lb N/a applied preplant with 50 lb N/a applied at the V10 (ten-leaf) growth stage; 5) 150 lb N/a applied preplant with 50 lb N/a applied at the V6 growth stage; and 6) 150 lb N/a applied preplant with 50 lb N/a applied at the V10 growth stage.

Results and Discussion
In 2016, corn yielded 12 bu/a more with conventional tillage than with no-till (Table 1). Even though yield components were not significantly affected by tillage, the combined trend for greater stand, kernel weight, and kernels/ear likely accounted for the yield response to tillage. Adding N fertilizer more than tripled yields obtained in the no-N control. Applying 100 lb N/a preplant followed by 50 lb N/a at the V6 growth stage did not improve yields above those obtained with all 150 lb N/a applied preplant. However, delaying the 50 lb N/a side-dress application to the V10 stage improved yield by 8.4 bu/a compared with all N applied preplant. A similar yield increase was found by delaying the side-dress to the V10 stage instead of the V6 stage when adding 50 lb N/a extra to a 150 lb N/a preplant application. These effects of N timing on corn yield in 2016 appeared to be related to responses in kernel weight and kernels/ear.
1 Conventional tillage: chisel, disk, and field cultivate.
2 Nitrogen treatments: Control, no N fertilizer; 150 PP, 150 lb N/a applied preplant with no side-dress N; 100 PP/50 V6, 100 lb N/a applied preplant with 50 lb N/a side-dressed at the V6 (six-leaf) growth stage; 100 PP/50 V10, 100 lb N/a applied preplant with 50 lb N/a side-dressed at the V10 (ten-leaf) growth stage; 150 PP/50 V6, 150 lb N/a applied preplant with 50 lb N/a side-dressed at the V6 growth stage; and 150 PP/50 V10, 150 lb N/a applied preplant with 50 lb N/a side-dressed at the V10 growth stage.

Response of Soybean Grown on a Claypan Soil in Southeastern Kansas to the Residual of Different Plant Nutrient Sources and Tillage

Introduction
Increased fertilizer prices in recent years, especially noticeable when the cost of phosphorus spiked in 2008, have led U.S. producers to consider other alternatives, including manure sources. The use of poultry litter as an alternative to fertilizer is of particular interest in southeastern Kansas because large amounts of poultry litter are imported from nearby confined animal feeding operations in Arkansas, Oklahoma, and Missouri. Annual application of turkey litter can affect the current crop, but information is lacking concerning any residual effects from several continuous years of poultry litter applications on a following crop. This is especially true for tilled soil compared with no-till, because production of most annual cereal crops on the claypan soils of the region is often negatively affected by no-till planting. The objective of this study was to determine if the residual from fertilizer and poultry litter applications under tilled or no-till systems affects soybean yield and growth.

Experimental Procedures
A water quality experiment was conducted near Girard, KS, on the Greenbush Educational facility's grounds from spring 2011 through spring 2014. Fertilizer and turkey litter were applied prior to planting grain sorghum each spring. Individual plot size was 1 acre. The five treatments, replicated twice, were: Control -no N or P fertilizer or turkey litter -no-till; Fertilizer only -commercial N and P fertilizer -chisel-disk tillage; Turkey litter, N-based -no extra N or P fertilizer -no-till; Turkey litter, N-based -no extra N or P fertilizer -chisel-disk tillage; and Turkey litter, P-based -supplemented with fertilizer N -chisel-disk tillage.
Starting in 2014 after the previously-mentioned study, soybean was planted in the plots with no further application of turkey litter or fertilizer. Prior to planting soybean, tillage operations were done in appropriate plots as in previous years. A subarea of 20 × 20 ft near the center of each 1-acre plot was designated for crop yield and growth measurements. Samples were taken for dry matter production at V3-V4 (approximately 3 weeks after planting), R2, R4, and R6 growth stages. Yield was determined from the center 4 rows (10 × 20 ft) of the subarea designated for plant measurements in each plot.

Results and Discussion
During 2014-2016, the residual effects of turkey litter and fertilizer amendments affected soybean yield and pods/plant (Table 1). The two treatments that had previously received a high application rate of turkey litter based on N requirements, regardless of tillage system, resulted in greater yields than plots that had received low rates of turkey litter (P-based), commercial fertilizer, or no fertilizer N or P. Although the average number of pods/plant was greatest where N-based turkey litter had been applied with no-till, the stand tended to be lower than where the N-based turkey litter was incorporated with tillage; this difference was significant only in 2015 (year interaction data not shown). Dry matter production was greater early (V3) and late (R6) in the season where N-based turkey litter had been applied and incorporated with tillage than in the other residual treatments (Table 1).

Introduction
Crop production is dependent on many factors, most notably, environmental conditions during the growing season. Here, we present a summary of environmental conditions experienced during the 2016 growing season in comparison to previous years and the historical averages. Information on crop yields is taken from reported yields from variety trials and research plots in southeast Kansas.

Experimental Procedures
The Kansas State University Crop Performance Tests were conducted in replicated research field plots throughout the state. This report summarizes crop production for southeast Kansas. Wheat, sorghum, and sunflowers were grown at the Parsons facility. Soybean varieties were grown at Columbus (upland) and Erie (river bottom). Corn was grown at Erie (full season) and Parsons (short season); both corn variety tests were abandoned due to crop loss. Please see individual variety results at the K-State Crop Performance Tests web page (http://www.agronomy.k-state.edu/services/crop-performance-tests/).
Weather information was collected from the Kansas State Mesonet site (http://mesonet.k-state.edu/weather/historical/). Historical data from the Parsons and Columbus stations were used in preparing these reports.

Results and Discussion
Weather

Rainfall
Rainfall is highly variable, both spatially and temporally. Total rainfall for 2016 at Columbus was 39.73 inches, slightly above the six-year average of 39.21 inches. The early spring season was dry at Columbus but nearly average at Parsons. Columbus had a lengthy dry period (0.65 inches total rain) in June that was broken by 5 inches of rain over a two-week period beginning June 30. Two additional periods of heavy rain (6.3 inches from September 8-16; 6.35 inches from October 4-12) brought the yearly total at Columbus to average. The largest single-day rain event of 2016 in Columbus was 4.43 inches on October 6. The same storm system brought 6.35 inches of rain to Parsons, including the largest single-day rain event at Parsons in 2016, 5.62 inches on October 6. Rainfall at Parsons was very close to average throughout the winter and spring, but the two-week period from June 23 until July 7 had 9.33 inches of rain, which pushed total rainfall at Parsons above average, where it remained for the rest of the calendar year. Annual rainfall at Parsons, 44.64 inches, was well above the 31.15-inch average.

Temperature
Temperature is a critical determinant of crop growth and performance. Many studies rely on growing degree days or growing degree units to estimate crop growth and development. We have shown that crop growth is also sensitive to the number of days above 90°F (corn) and 95°F (soybeans). An early warm period occurred from mid-June to mid-July, with an above-average number of days exceeding 90°F and 95°F at both Parsons and Columbus (Figure 2). For the remainder of the growing season, the number of days exceeding 90°F was nearly normal (57 days) at Columbus and slightly below normal (47 days) at Parsons. The number of days that temperatures exceeded 95°F (12 days at Columbus; 14 days at Parsons) was below normal and similar to 2014.
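Growing degree days and hot-day counts like those above can be computed from daily maximum/minimum temperatures. A minimal sketch using the common capped 86/50°F convention for corn (an assumption for illustration, not necessarily the method used in these reports; the function names and July temperatures are hypothetical):

```python
def gdd_f(tmax_f, tmin_f, base_f=50.0, cap_f=86.0):
    """Daily growing degree days (°F), with the usual cap on the max and floor on the min."""
    tmax = min(tmax_f, cap_f)
    tmin = max(tmin_f, base_f)
    return max((tmax + tmin) / 2.0 - base_f, 0.0)

def days_above(daily_tmax_f, threshold_f):
    """Count days whose maximum temperature exceeds a threshold (e.g., 90 or 95°F)."""
    return sum(1 for t in daily_tmax_f if t > threshold_f)

# Hypothetical run of July daily highs
highs = [88.0, 91.0, 96.0, 84.0, 97.0]
print(days_above(highs, 90.0))  # 3
print(days_above(highs, 95.0))  # 2
print(gdd_f(95.0, 70.0))        # 28.0 (max capped at 86°F)
```

Summing `gdd_f` over the season gives the accumulated growing degree units used to track crop development stage.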

Crop Production
Winter wheat yielded well in 2016 (Figure 3). Yields for the 22 hard red varieties tested ranged from 57 to 77 bu/a, with an average of 66 bu/a; yields for the 19 soft red varieties tested ranged from 53 to 96 bu/a, with an average of 72 bu/a (Figure 3). Both were above the six-year averages of 52 bu/a for hard red and 62 bu/a for soft red varieties. Wheat yield and quality are particularly sensitive to high rainfall during maturation (approximately April 24 to May 14). During this period, Columbus received 3.82 inches of rain, less than its six-year average of 4.04 inches; Parsons also received less rain (2.14 inches) than its six-year average (2.83 inches). This is well below the high rainfall (3.87 inches) received at Parsons during this same period in 2015, which was marked by high rates of Fusarium head blight (FHB) infection. We did have some problems with stripe rust this year, which resulted in improved wheat yield with fungicide application.
Corn yields were good in 2016, though not as good as in 2014 (Figure 4). The short-season corn variety test at Parsons was abandoned due to wind damage, and the full-season test at Erie was abandoned due to animal damage, so corn yield results from full-season corn from other studies at the research station are presented. Over the past six years, full-season hybrids have averaged 183 bu/a, while short-season hybrids averaged 118 bu/a for southeast Kansas. No-till corn studies showed slightly higher yields in 2016, averaging 140 bu/a, while conventionally tilled corn yielded 117 bu/a at Columbus (Figure 5).
Soybean yields were also above the six-year average, with maturity group (MG) 3-4 averaging 53 bu/a and MG 4-5 averaging 52 bu/a across all varieties and locations (Figure 5). Conventional soybeans also yielded above the six-year average, with 35 bu/a for MG 3-4 and 44 bu/a for MG 4-5 (Figure 6).
In contrast to other crops in 2016, sorghum production (57 bu/a) was much less than average (97 bu/a) (Figure 6). Sunflowers yielded about average (Figure 7).

Introduction
Charcoal rot is a plant disease caused by the fungus Macrophomina phaseolina (Tassi) Goid. that limits soybean yield and performance. The fungus is highly prevalent in crop fields in southeast Kansas. Certain plants produce chemicals that act as biofumigants, controlling or reducing harmful soil fungi such as the one that causes charcoal rot. Bacterial control of diseases has been used successfully in potato (Larkin et al., 2011) and cacao production (Melnick et al., 2008), and Mengistu et al. (2009) showed some suppression of charcoal rot infestation with altered tillage and use of rye as a cover crop. The research outlined here tested the ability of mustard species used as cover crops to control charcoal rot in soybean production. Incorporating a cover crop into the crop rotation may be a simple method of controlling soil-borne diseases.

Experimental Procedures
Soybean plants were grown in replicated field plots using two methods of charcoal rot control: chemical (fungicide seed treatment) and biological (mustard cover crop). The control received no biological or chemical treatment. The biological treatment was a mustard, Mighty Mustard Pacific Gold (Johnny's Select Seed, Winslow, ME), a variety that produces high glucosinolate concentrations suggested to control soil-borne diseases. The chemical treatment was a fungicide seed treatment applied prior to planting (Acceleron, 4 oz/100 lb). The fourth treatment combined the biological and chemical treatments.
The mustard seed was planted in late March, when soil temperatures were consistently above 50°F. Prior to maturity, the mustard was terminated with herbicide and disked in after the plants had died. The ground was then tilled in all plots in preparation for planting. The soybean cultivars selected included two early maturity group 4s, two late group 4s, and a mid- to early-group 5.
To test charcoal rot infestation in the soil, soil samples were collected after the mustard was terminated and disked in, prior to planting the soybeans. Additional soil samples were taken in the fall, coincident with plant sampling at the R7-R8 stage. The numbers of colony-forming units (CFUs) of the fungus in the plant and soil samples were measured by the Department of Plant Pathology at Kansas State University. Additional samples were used to determine soil microbial activity with the phospholipid fatty acid (PLFA) assay. Final yield was measured at harvest.

Results and Discussion
Mustard plants reduced the number of colony-forming units of the fungus in the soil and in the plant roots (Figure 1); therefore, the mustard cover crop reduced disease pressure from the charcoal rot fungus. The interaction between the mustard and the fungicide treatment was confounded by environmental factors, as each year showed a different response of soil CFU counts to the combined control. A modest but significant improvement in yield was observed in 2016 for the combined chemical and biological control (Figure 2); no difference in yield was observed in 2015. Both years of the study had relatively mild summers, with little incidence of charcoal rot damage reported.
The early-maturing soybean varieties showed greater infection rates, with higher numbers of CFUs in the plant stem and roots (Table 1), in both years. While this may indicate greater susceptibility of the early-maturing cultivars to charcoal rot, it is more likely a function of weather patterns in southeast Kansas. Charcoal rot is most prevalent under the hot, dry conditions usually experienced in July and August in southeast Kansas, which is also the period during which the early-maturing varieties flower. The increased sensitivity to charcoal rot may thus depend more on the weather than on the genetics of these varieties.

Introduction
Fusarium head blight (FHB) or scab is most commonly observed in wheat in southeast Kansas. However, in 2015, much of eastern Kansas experienced a devastating infection level of FHB. FHB decreases wheat yield, but more importantly, reduces wheat quality due to development of mycotoxins associated with the fungal infection. High levels of vomitoxin or deoxynivalenol (DON) can render the wheat unfit for human consumption, and at very high levels, may not be suitable as a feed grain.

Experimental Procedures
The 2015 wheat harvest season experienced a long period of rain. Wheat harvested before the rain was generally good, with little fungal infection; wheat harvested after the rain tended to have a higher rate of FHB. We obtained two lots of 2015 wheat seed (cv. Everest) from a cooperating farmer, one harvested early and one harvested late (Figure 1). The late-harvested seed was of poorer quality, and the farmer performed extra cleaning to try to improve it.
Seed was planted in replicated research plots at Parsons in fall 2015. Fungicide treatments included: control (no fungicide); seed treatment; in-season (flag leaf and bloom); and seed treatment + in-season. Plants were harvested at maturity in June 2016. The harvested seed was tested at the Kansas Grain Inspection Service for test weight and protein content.

Results and Discussion
Late-harvested wheat seed was of noticeably poorer quality, with many white kernels (Figure 1). The late-harvested seed also had a lower test weight (57 lb/bu) than the early-harvested seed (63 lb/bu). Both early- and late-harvested wheat seed had levels of DON that rendered the wheat unfit for human consumption but would allow its use as an animal feed. The late-harvested seed had a much greater number of damaged kernels (data not shown), potentially due to the additional cleaning.
The 2016 harvest season experienced a long dry period, greatly improving the harvested quality of the wheat. Disease pressure in 2016 was minor. However, each additional fungicide treatment produced a further increase in yield (Figure 2). Seed treatment plus in-season fungicide applications showed a 20-bu/a yield improvement over the untreated control. Although the difference between the early- and late-harvested seed was not statistically significant, a consistent trend showed that the poorer seed quality of the late-harvested wheat reduced yields across all treatments. No consistent differences in test weight or protein content were observed in the 2016 harvest based on initial seed quality.
Figure 2. Impact of fungicide treatment on wheat yield for early-harvested ("good") and late-harvested ("bad") wheat seed.

Introduction
Good soil functionality improves the resiliency of the agronomic production system. Soils work in concert with weather, management practices, and genetics to determine the overall yield from a crop. Soil is increasingly recognized as a living ecosystem, and the Natural Resources Conservation Service (NRCS) (2012) has defined soil health as: "The continued capacity of soil to function as a vital living ecosystem that sustains plants, animals, and humans." As described by the NRCS, the functions of healthy soils include providing habitat for plants, animals, and soil microorganisms; providing stability and support; cycling nutrients; filtering and buffering; and regulating water.
Some soil characteristics are commonly measured, such as the physical makeup (clay/silt/sand content, bulk density, water content, and drainage ability) and the chemical characteristics (pH and nutrient levels, including carbon (C), nitrogen (N), phosphorus (P), potassium (K), and micronutrients). These are important determinants of soil health. The final component that is critical to the overall capacity of soil to provide a "vital living ecosystem" is the biological component. We are learning much more about the factors involved in the biology of soils and their role in soil health.
The biological components of the soil include plant roots, bacteria, fungi, protozoa, nematodes, arthropods, earthworms, and animals. Some of these are beneficial, for example, the Rhizobia bacteria that work with certain nitrogen-fixing plant species, such as soybeans, to fix nitrogen. Arbuscular mycorrhizal fungi (AMF) are a group of beneficial fungi that form close associations with plants, actually growing into the root cells of vascular plants and helping the plants take up nutrients. Other microorganisms are detrimental, such as the fungus Macrophomina phaseolina, which causes charcoal rot.
We know a good deal about how to manage the physical and chemical characteristics of soils to improve their productive capacity. We are still learning the importance of the biological components and their contribution to agronomic productivity. Biological soil characteristics are important for their role in integrating the physical and chemical characteristics of the soil for optimal productivity.
This report presents the factors that are important for healthy soils. It also describes new research in progress on soil health.

Experimental Procedures
Soil samples were collected from a research field in Columbus, KS, under three management systems: conventional-till row-crop production (CT), no-till row-crop production (NT), and a long-term hay meadow (HM). The soil is a nearly level Parsons silt loam. Soil samples were collected at different stages during the production cycle (preplant, after planting, at bloom, and at harvest) and separated into seven depth intervals (0-2, 2-6, 6-10, 10-14, 14-18, 18-22, and 22-30 inches). The samples were analyzed for microbial community composition by phospholipid fatty acid (PLFA) analysis and for microbial activity by soil enzyme assays.

Results and Discussion
The soil in the field is a Parsons silt loam soil, described by the NRCS Web Soil Survey (2016) as prime farmland, with loess soil over clayey alluvium or clayey residuum weathered from clayey shale. The typical soil profile has productive silt loam soil to a depth of 14 inches, with a somewhat poorly drained silty clay layer, commonly referred to as the claypan, below about 14 inches.
In the surface soils, microbial biomass was greatest in the hay meadow, followed by no-till, and then conventional-till production systems (Figure 1). Subsoils from the HM had the greatest microbial concentration at every depth interval. Soil enzyme activities also followed a similar pattern. Microbial activity was greatest in the surface soils of the HM, followed by NT and CT agricultural systems.
Land management practices impacted microbial community composition (Figure 2). The fungal fraction was greatest in the soils from the HM, as indicated by the greater ratio of fungi to bacteria. No significant differences in total fungal content were measured between NT and CT soils. However, in contrast to the total fungal populations, the AMF fraction was more concentrated in the NT system than in the CT system.
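The fungi-to-bacteria ratio mentioned above is typically derived from PLFA biomarker concentrations. A minimal sketch of that calculation, using hypothetical biomarker values rather than data from this study:

```python
# Illustrative sketch: computing a fungal-to-bacterial (F:B) ratio from
# PLFA biomarker concentrations (nmol/g soil). These values are
# hypothetical placeholders, not measurements from this study.

fungal_plfa = 12.4     # assumed sum of fungal biomarker PLFAs
bacterial_plfa = 58.7  # assumed sum of bacterial biomarker PLFAs

fb_ratio = fungal_plfa / bacterial_plfa
print(f"F:B ratio = {fb_ratio:.3f}")  # a higher ratio indicates a more fungal-dominated community
```

A higher F:B ratio, as seen in the hay meadow soils, is generally associated with less disturbed systems.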
The results also demonstrate a stratified response of soil microbial properties with depth in the soil profile. The land management practices influenced soil biological activity in the upper 6 in. of the soil profile (Table 1). The soil microbial properties in the lower soil profile (below approximately 14 in.) were dependent on the parent material and weathering. The intermediate depths in the soil profile could be influenced by both parent materials and management practices.
These results demonstrate the impact of management practices on soil microbial activity. Because AMF are important in nutrient cycling and nutrient uptake by plants, their increased populations in NT systems improve the productive capacity of the soil. Changes in management practices can have profound impacts on the health of the soil, and hence on its productive capacity.

Introduction
The productive capacity of soil is one of the key components determining yield and quality of crops. While some soil factors can be altered through management, other characteristics cannot be modified and instead must be managed. Soils of southeast Kansas are potentially productive silt loam underlain with an impermeable clay layer. The soils within a field can be highly variable, in part due to fluctuating depths of the silt loam topsoil. Other factors, including topographic position in the landscape, such as whether the area is at the top of a hill or at a low point in the field, alter the productivity of the soil through modification of drainage.
Precision agriculture is a management strategy that seeks to optimize return on investment by matching the production potential of a region within a field with the needed inputs for that level of productivity. The production potential, or productive capacity, is the capacity of the soils to produce at a given level (yield per acre). In precision agriculture, prescribed rates of inputs are developed to apply reduced inputs on areas within a field that have limited production potential, while highly productive areas are given more inputs to support that high level of productivity. This strategy improves net return by putting resources where they are most likely to give the highest return, and reducing application of costly inputs on poorly-performing regions within a field. Precision agriculture is a powerful technology, but requires accurate mapping of within-field spatial variability and knowledge of factors contributing to that variability.
Soil variability is a key component of the spatial variability of plant production and yield observed in production fields. Changes in soil characteristics across a field are modified by the growing environment (temperature and rainfall) and management practices (tillage, fertility, etc.). Delineating zones of soil productivity allows development of prescriptions to match management practices and inputs to the productive capacity of each distinct production zone within a field. Changes in soil characteristics can often be detected visually through changes in soil color. Publicly available imagery allows examination of entire fields from aerial images at different times during the production year. Grid or zone sampling measures details of soil characteristics, including texture (sand, silt, and clay content), organic matter, and nutrient content. Soil sampling has limitations, however, due to the expense of analysis and limited coverage.
Electrical conductivity is a measure of how well a material, in this case soil, conducts electricity. The soil's ability to conduct electricity changes as a function of soil texture (clay, silt, and sand content), organic matter, cation exchange capacity, water content, and salinity. Soil electrical conductivity thus varies as a function of several key factors important in crop production. It is also relatively stable, changing very little over the course of a year or under different management practices. This makes it a useful tool for defining productivity zones within a crop field. Moreover, it can be used to map an entire field relatively quickly, giving a good measure of the spatial variability of soils across the whole field. These maps can then be used to identify zones of variability to direct soil sampling, or to develop prescription maps for site-specific applications.
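One simple way to turn conductivity maps into zones, sketched below, is to split the measured values at quantile cutoffs. The conductivity values are hypothetical, and the three-zone split is an illustrative choice, not the method used in this study:

```python
# Sketch: delineating management zones from apparent electrical
# conductivity (ECa) point measurements using tertile cutoffs.
# The ECa values (mS/m) are hypothetical, for illustration only.
import statistics

eca = [12.1, 14.3, 18.9, 22.4, 25.7, 31.2, 35.8, 40.1, 44.6]

# Two cut points split the measurements into low/medium/high thirds
cuts = statistics.quantiles(eca, n=3)

def zone(value):
    """Assign one ECa reading to a productivity zone."""
    if value <= cuts[0]:
        return "low ECa (lower clay)"
    if value <= cuts[1]:
        return "medium ECa"
    return "high ECa (higher clay)"

for v in eca:
    print(f"{v:5.1f} mS/m -> {zone(v)}")
```

In practice, zone boundaries would also be checked against soil samples and yield maps before writing a prescription.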

Experimental Procedures
Crop production fields were selected in collaboration with farmer-cooperators. Yield and plant growth information was collected at harvest. Yields were recorded with commercial yield monitors on production-scale combines and mapped in SMS Advanced (AgLeader, Ames, IA). Profit maps were developed based on K-State Research and Extension Cost-Return Budgets for corn, soybeans, and wheat grown in southeast Kansas (Ibendahl et al., MF992, MF993, and MF994). A Veris 3100 system (Veris Technologies, Salina, KS) was used to measure soil electrical conductivity. Soil samples were taken at discrete locations throughout the fields and tested at the Kansas State University Soil Testing Lab in Manhattan, KS, for determination of soil texture and nutrient content.
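The profit-map idea above reduces, at each mapped point, to net return = yield × price − per-acre cost. A minimal sketch of that arithmetic; the yields, price, and cost figures are hypothetical placeholders, not values from the K-State Cost-Return Budgets cited above:

```python
# Sketch of a per-point profit calculation from a yield map.
# All numbers are assumed, for illustration only.

yield_map = {          # field region -> corn yield, bu/a (hypothetical)
    "SE corner": 95.0,
    "center": 165.0,
    "north": 130.0,
}
price = 3.50           # $/bu (assumed grain price)
cost_per_acre = 450.0  # $/a (assumed total input and operating cost)

# Net return per acre for each region
profit = {region: y * price - cost_per_acre for region, y in yield_map.items()}
for region, net in profit.items():
    print(f"{region:10s}: ${net:8.2f}/a")
```

A real profit map applies this calculation cell-by-cell across the yield monitor grid, with budget line items in place of a single lumped cost.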
Historical images of the crop production fields were downloaded from Google Earth. Digital elevation maps (DEMs) were downloaded from the Kansas Data Access and Support Center (http://www.kansasgis.org/resources/lidar.cfm), and were used for terrain analysis of the production fields using ArcGIS 10.1 (Esri, Redlands, CA).

Results and Discussion
Visual inspection of fields gives immediate information on potential regions of variability. Imagery is also publicly available from Google Earth and other providers (http://nationalmap.gov). By selecting previous years in Google Earth, historical information on field conditions can be examined. This imagery can be used to identify potential low-lying areas that may hold water (darker soils) or potential zones of high runoff (lighter soils; Figure 1A). The production field shows wetter (darker) soils in the center of the field, as well as along the terraces. The terraces are seen to drain into the grassed waterway in the west-central southern portion of the field. A region of very light soil is seen in the southeast corner.
Information on specific soil types in the field is available from the Web Soil Survey (https://websoilsurvey.sc.egov.usda.gov/App/HomePage.htm). Three predominant soils are identified in the field: Dennis silt loam, Parsons silt loam, and Kenoma silt loam (Figure 1B). The complete description, available from the Web Soil Survey, describes the Dennis silt loam as a silty and clayey residuum weathered from shale, with a typical profile of silt loam from 0 to 10 inches, silty clay loam from 10 to 15 inches, and silty clay below 15 inches. Dennis silt loam is in hydrologic class C, indicating a layer in the lower soil profile that impedes downward water movement. The Parsons and Kenoma soils have silt loam extending from 0 to 13 inches, with silty clay beginning at 13 inches. Both Parsons and Kenoma are classified in hydrologic soil group D, indicating a very low rate of water infiltration due to a claypan or clay layer near the surface. The Kenoma is weathered from limestone and shale. The restrictive layer begins at about 80 inches in all three soil types. Soils throughout the field are classified as prime farmland.
A Veris 3100 system was used to map apparent electrical conductivity (ECa) across the entire field (Figure 2). The Veris system measures ECa through the soil using two arrays of electrodes on coulters. The arrays measure ECa at two depth ranges: 0-10 inches and 0-30 inches. In our measurements, clay content was the largest determinant of soil ECa. High ECa measurements are indicative of soil with high clay content. This coincides with the region of lighter soil in the southeast corner of the field, and also with the soils to the north of the grassed waterway in the center of the field (Figure 3A). The soils with the lowest ECa are observed in the center of the field.
Corn yield corresponded closely with ECa: the lowest yields were measured in the southeast corner, which had the highest ECa (Figure 3B). The best yields corresponded with the regions of lowest measured ECa, in the center of the field. Using yield maps from one complete crop rotation (corn/winter wheat/soybeans) over 2 years, together with the K-State Cost-Return budgets, we developed a profitability map of the field (Figure 4). The southeast corner of the field and the area just north of the grassed waterways had the lowest profitability. The center of the field had the greatest return. The area to the north of the field had intermediate return.
The measurements demonstrate the extent of soil variability within a production field, and methods of identifying potential sources of that variability. Clay content is one factor contributing to the observed variability in crop production. The ECa measurements give a spatial map of the variability in soil characteristics throughout the field. This map can be used to develop a zone sampling strategy to further delineate sources of soil variability. Other factors contributing to variability in crop production may include topographic position and soil moisture content, which are correlated, as soils at higher elevation tend to dry out more quickly while low-lying areas stay wetter. This knowledge can be used to develop site-specific management practices for the field. Implementing precision management practices could improve net return by reducing inputs on regions with low productive capacity.