Cattlemen's Day 2018



Introduction
Sericea lespedeza (Lespedeza cuneata) is a highly fecund noxious weed in Kansas and surrounding states. Individual sericea lespedeza stems are capable of producing more than 700 seeds annually. Because of prolific reproductive capabilities, sericea lespedeza can rapidly infiltrate native and cultivated grasslands; seed can be transported great distances via farm machinery and the alimentary canal of wild and domestic herbivores. Sericea lespedeza infests more than 900 square miles of pasture in Kansas alone, primarily in the Flint Hills region. Resulting degradation to native habitats for wildlife and pasture conditions for domestic herbivores has been devastating.
Traditional management practice in the Kansas Flint Hills involves annual spring burning in April followed by grazing with yearling beef cattle from late April to August. During seasonal grazing, 40 to 60% of annual graminoid production is removed and grazing lands remain idle for the remainder of the year. Under these management practices, sericea lespedeza has steadily expanded into the tallgrass prairie biome. Previous research reported that dormant-season, spring fires may stimulate sericea lespedeza seed germination. Additionally, application of growing season fire at three-year intervals decreased the rate of sericea lespedeza invasion. Therefore, the objective of our study was to evaluate the effects of annual prescribed burning applied during the growing season on vigor of sericea lespedeza infesting native tallgrass range.

Experimental Procedures
A 125-acre native tallgrass pasture located in Geary County, KS, was used for our study. The site was historically grazed during the winter and spring by beef cattle; moreover, the infestation of sericea lespedeza on the site was problematic for the 20-year period preceding our study. This site was divided along watershed boundaries into nine fire-management units (14 ± 6 acres). Unit boundaries were delineated by mowing firebreaks (≈ 20-ft wide) around each perimeter. Units were assigned randomly to one of three prescribed-burning times (n = 3/treatment): early spring (April 1), mid-summer (August 1), or late summer (September 1). Prescribed burns were carried out on or near target dates when appropriate environmental conditions prevailed: surface wind speed < 15 mph; surface wind direction = steady and away from urban areas; mixing height > 1800 ft; transport wind speed > 8 mph; relative humidity = 40 to 70%; ambient temperature = 50 to 100°F; and Haines index ≤ 4. All prescribed burning activities were carried out with the permission of Geary County Emergency Services, Junction City, KS (permit no. 348).
Permanent 100-yd transects were established in each fire-management unit. Aerial frequency and stem length of sericea lespedeza were measured along each transect (100 12 × 12-in plot points/transect). Data were collected along transects on average dates of July 19 and October 10. Biomass was estimated at 3-ft intervals along transects using a visual obstruction technique. In addition, a 12 × 12-in plot was projected on the side of transects at each point of measurement. Aerial presence of sericea lespedeza was noted in each plot (i.e., yes or no). If sericea lespedeza was present, stem length was measured in inches from the surface of the soil to the maximum stem length by manually holding the sericea lespedeza stem erect.
A total of 100 mature sericea lespedeza plants were collected adjacent to permanent transects in each burn-management unit immediately after the first killing frost (approximately November 1). Plants were clipped at ground level and placed into a labeled paper bag. Bagged samples were dried using a forced-air oven. Individual plants in each sample were defoliated by hand. Resulting seeds, chaff, and stems were also separated by hand. The total amount of seed recovered from each sample was weighed to the nearest milligram. Seed weight was converted to seed count, assuming a density of 770 seeds/g. Average seed production was calculated by dividing the number of seeds by the number of sericea lespedeza plants in each sample (n = 100).
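The weight-to-count conversion described above can be sketched as follows; the 770 seeds/g density and 100-plant sample size are from the report, while the example seed weight is hypothetical.

```python
# Convert recovered seed weight to average seed production per plant.
# Seed density (770 seeds/g) and sample size (100 plants) are from the
# report; the 65-g example weight is hypothetical.
SEEDS_PER_GRAM = 770
PLANTS_PER_SAMPLE = 100

def seeds_per_plant(seed_weight_g: float) -> float:
    """Convert recovered seed weight (g) to average seeds per plant."""
    total_seeds = seed_weight_g * SEEDS_PER_GRAM
    return total_seeds / PLANTS_PER_SAMPLE

# e.g., 65 g of seed recovered from a 100-plant sample:
print(seeds_per_plant(65.0))  # 500.5 seeds/plant
```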
In the final year of the experiment, 10 randomly-placed, quarter-meter plots were harvested via clipping along each transect to determine sericea lespedeza biomass within our prescribed-fire treatments. Clipped material was hand sorted into sericea lespedeza biomass and non-sericea lespedeza biomass, dried, and weighed.

Results and Discussion
Canopy frequency and average stem length of sericea lespedeza were not influenced by time of measurement; therefore, main effects of treatment are reported (Table 1). Following four years of treatment, average stem length and aerial frequency of sericea lespedeza were less (P≤0.01) in mid- and late-summer treatments than in the early spring treatment.
Whole-plant dry matter weight of sericea lespedeza at dormancy and seed production per sericea lespedeza plant were greatly diminished (P<0.01) in mid- and late-summer treatments compared with the early spring treatment (Table 1). Seed production in areas treated with mid-summer fire was less than 5% of that in areas treated with dormant-season spring fire. In areas treated with late-summer fire, seed production was less than 0.1% of that in areas treated with dormant-season spring fire. Clearly, the capability of sericea lespedeza to reproduce via seed was sharply curtailed under a growing-season fire regime.
The total amount of sericea lespedeza biomass in all treatments, as measured via manual harvest during July of 2017, is shown in Table 1. Biomass of sericea lespedeza in the early spring treatment was 901 lb dry matter/acre, or 17% of total biomass. Biomass of sericea lespedeza in the mid-summer (394 lb dry matter/acre) and late-summer (86 lb dry matter/acre) treatments was much less (P≤0.01) than in the early spring treatment (7% and 2% of total biomass, respectively).
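As a quick arithmetic check on the shares above (a sketch; the lb/acre and percentage values are from the report):

```python
# Back-calculate total biomass from the reported sericea biomass and its
# share of the total, then verify the percentage (values from the report).
def pct_of_total(sericea_lb: float, total_lb: float) -> float:
    """Sericea lespedeza biomass as a percentage of total biomass."""
    return 100.0 * sericea_lb / total_lb

# Early spring: 901 lb/acre sericea at 17% of total implies ~5,300 lb/acre total.
total_early = 901 / 0.17
print(round(total_early))                     # 5300
print(round(pct_of_total(901, total_early)))  # 17
```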

Implications
Burning during the summer months for 4 consecutive years resulted in significant decreases in aerial frequency, stem length, seed production, and biomass of sericea lespedeza compared to traditional spring, dormant-season burning. Growing-season prescribed burning is an inexpensive and comprehensive means to control sericea lespedeza propagation and invasion. At the time of this writing, prescribed burning in the Kansas Flint Hills had a cash cost of less than $1/acre, whereas fall application of herbicide was estimated to cost > $18/acre.

Introduction
Fire has been key to sustaining the ecological integrity of the Kansas Flint Hills. While fire has occurred naturally throughout history, aboriginal peoples also used prescribed fire as a means to attract wild game animals, manipulate growth of food plants, and condition wood or stone to make implements. Early European settlers to the Great Plains adopted the use of fire to manage woody-plant encroachment and to enhance growth performance of domesticated herbivores. Currently, fire is used as a treatment for the control of invasive species such as eastern red cedar (Juniperus virginiana) and roughleaf dogwood (Cornus drummondii). The most common grazing management practice in the Kansas Flint Hills involves annual spring burning in April followed by intensive grazing with yearling beef cattle from April to August.
Under this management practice of annual spring fire and seasonal grazing, the invasive plant sericea lespedeza (Lespedeza cuneata) has steadily increased. Introduced in the late 19th century for its soil conservation properties, sericea lespedeza has been an aggressive invader of Flint Hills plant communities. Individual plants can produce more than 700 seeds per stem annually; seed can be transported in contaminated hay, via machinery, and via the digestive tracts of animals. Up to this time, control of sericea lespedeza has been limited to repeated, costly applications of herbicide, which have met with limited success and resulted in collateral damage to non-target plant species. We reported previously that prescribed burning conducted during either early August or early September had strong suppressive effects on sericea lespedeza. The objective of this report is to document the effects of prescribed burning during April, August, or September on grasses, forbs, and shrubs native to the tallgrass prairie region.

Experimental Procedures
Our experiment was conducted on a 125-acre native tallgrass pasture located in Geary County, KS. The site was historically grazed during the winter and spring by beef cattle; moreover, the infestation of sericea lespedeza on the site was problematic for the 20-year period preceding our study. Escort XP (metsulfuron methyl; Bayer Crop Science LP, Research Triangle Park, NC) was broadcast-applied via aircraft onto the site in the fall of 2013 at a rate of 1 oz/acre. Despite herbicide treatment, basal frequency of sericea lespedeza was 2 ± 1.3% and aerial frequency, the percentage of 1 ft × 1 ft plots in which sericea lespedeza was detected, was 36 ± 3.4% the following spring.
The study site was divided along watershed boundaries into 9 fire-management units (14 ± 6 acres). Unit boundaries were delineated by mowing firebreaks (≈ 20 ft wide) around each perimeter. Units were assigned randomly to 1 of 3 prescribed-burning times (n = 3/treatment): early spring (April 1), mid-summer (August 1), or late summer (September 1). Prescribed burns were carried out on or near target dates when appropriate environmental conditions prevailed: surface wind speed < 15 mph; surface wind direction = steady and away from urban areas; mixing height > 1800 ft; transport wind speed > 8 mph; relative humidity = 40 to 70%; ambient temperature = 50 to 100°F; and Haines index ≤ 4. All prescribed burning activities were carried out with the permission of Geary County Emergency Services, Junction City, KS (permit no. 348).
Forage biomass was measured along a single permanent 100-yd transect in each fire-management unit. Transects were read on July 19 and October 10 annually. At 3-ft intervals along each transect, biomass was estimated using a visual obstruction technique. Plant species composition and soil cover were assessed along each permanent transect in mid-July using a modified step-point technique. One hundred transect points were evaluated for bare soil, litter cover, or basal plant area (% of total area). Plants were identified by species; basal cover of individual species was expressed as a percentage of total basal plant area.

Results and Discussion
Total forage biomass was influenced by treatment and measurement date (treatment × time, P<0.01; Table 1). Forage biomass was not different (P=0.78) between treatments on July 19 over the 4-year course of our study. We concluded that repeated burning during the growing season did not impair forage production compared to conventional spring burning.
As expected, forage biomass was greater (P<0.01) in early spring than in mid- or late-summer burn units on October 10, after growing-season fire treatments had been applied (Table 1). Prescribed fire treatments on August 1 and September 1 resulted in nearly complete removal of above-ground plant material; however, forage regrowth following fires resulted in significant accumulations of biomass prior to seasonal plant dormancy. Near the end of the growing season (October 10), mid-summer burn units recovered to 39% of pre-fire biomass levels during the 10-week period between treatment application and measurement. Similarly, late-summer burn units recovered to 23% of pre-fire levels over the 6-week period between treatment application and measurement. We concluded that post-fire regrowth was likely sufficient to prevent erosion and soil-moisture loss during the subsequent dormant season and would have allowed light to moderate grazing during the ensuing fall and winter.
Frequency of bare soil, litter cover, and total basal plant cover were not different (P≥0.21) between early spring, mid-summer, and late summer burn units (Table 2). Soil cover values were generally indicative of healthy, normal tallgrass prairie ecosystems. Total basal cover attributable to grasses was not different (P=0.24) between prescribed-burn treatments. In addition, combined basal cover of major warm-season grasses (i.e., big bluestem (Andropogon gerardii), little bluestem (Schizachyrium scoparium), indiangrass (Sorghastrum nutans), and sideoats grama (Bouteloua curtipendula)) were also not influenced (P=0.62) by the timing of prescribed burning.
Total basal cover attributable to forbs was not affected (P=0.38) by prescribed burning treatments (Table 2). Basal cover of sericea lespedeza was 4.3-fold greater (P=0.02) in early spring burn units compared with late-summer burn units, indicating that growing-season prescribed burning effectively controlled vegetative reproduction by sericea lespedeza. In addition, combined basal cover of western ragweed (Ambrosia psilostachya) and Baldwin's ironweed (Vernonia baldwinii) decreased (P<0.01) in mid- and late-summer burn units compared with early spring burn units. These benefits were accompanied by a tendency for combined basal cover of major wildflowers to increase (P=0.09) in late-summer burn units compared with early spring burn units. We speculated that control of sericea lespedeza, western ragweed, and Baldwin's ironweed was the result of selective pressure applied to these species with growing-season prescribed burning, whereas major wildflower species appeared to have been stimulated by growing-season prescribed burning.
Total basal cover attributable to all woody-stemmed plants was not different (P=0.45) between early spring, mid-summer, and late summer burn units (Table 2).

Implications
Compared to traditional spring, dormant-season prescribed burning, burning during the summer for 4 consecutive years resulted in excellent control of sericea lespedeza, Baldwin's ironweed, western ragweed, and invasive woody-stemmed plants. In addition, major wildflower species prevalence increased in areas treated with prescribed fires during the summer compared with adjacent areas treated with prescribed fire during the spring. Growing-season prescribed burning may be an inexpensive and fairly comprehensive means to control sericea lespedeza propagation. At the time of this writing, prescribed burning in the Kansas Flint Hills had a cash cost of less than $1 USD/acre, whereas fall application of herbicide, which can negatively affect non-target species, was estimated to cost between $18 and $36 USD/acre.

Introduction
Sericea lespedeza (Lespedeza cuneata) is an herbaceous perennial legume native to Asia. Beginning in the 19th century, it was introduced into the United States as both a forage crop and a soil-conservation measure. The broad adaptability of sericea lespedeza and its tolerance of poor quality soils made it a popular choice for re-seeding former strip-mining sites, highway rights-of-way, dams, and waterways. Unfortunately, the same traits that made sericea lespedeza a common selection for re-seeding projects also allowed it to invade native rangelands and pastures. Combining adaptability and hardiness with prolific seed production and allelopathy, sericea lespedeza has invaded more than 950 square miles in Kansas alone, where it was designated as a noxious weed in 2000.
This infestation is particularly pervasive within the Flint Hills region, where it has degraded native prairie ecosystems and reduced carrying capacity of rangelands for beef production. Sericea lespedeza is of little value to grazing cattle due to its high tannin content and inhibition of native grass production. Prescribed pasture burns conducted in March or April, a common component of current Flint Hills grazing systems, have not slowed the encroachment of sericea lespedeza and may have accelerated its spread. Recent research has shown that moving the application of prescribed fire from early April to August or September resulted in a decrease in sericea lespedeza frequency and vigor. Importantly, however, this required a multi-year commitment to late-season burning to achieve comprehensive sericea lespedeza control. Herbicide application alone has also proven to be inadequate to achieve complete control of sericea lespedeza. In many situations, achieving control of sericea lespedeza without the need for costly re-application of herbicide or a long-term commitment to late-season burning may be desired. Therefore, the objective of our study was to evaluate the efficacy of a one-time application of late summer prescribed fire followed by fall herbicide application for substantially reducing sericea lespedeza frequency and vigor.

Experimental Procedures
This study was conducted in Riley County, KS, on a single 80-acre native tallgrass pasture from which hay is routinely harvested during mid-summer. A light to moderate infestation of sericea lespedeza was present at the outset of this experiment. The pasture was divided into 16 units, using existing timber breaks and natural watersheds to form the boundaries of the units where possible. A single, permanent 100-yd transect was established within each unit. Transect endpoints were marked using numbered concrete blocks that remained in place for the duration of the experiment.

In late August 2016, initial measurements of sericea lespedeza frequency and vigor were taken along these transects. At 3-foot intervals, a 12 × 12-in square plot was projected alongside the transect. Within these plots, sericea lespedeza presence or absence was recorded. If sericea lespedeza was observed within a given plot, 3 additional observations were made: 1) whether multiple sericea lespedeza plants were present, 2) the stem length of the sericea lespedeza plant closest to the 1-yd mark on the transect line, and 3) the crown maturity of the closest sericea lespedeza plant. Stem length was measured by manually holding erect the sericea lespedeza stem and measuring from the ground to the tip of the stem. Crown maturity was evaluated visually. Crowns which contained senescent material or multiple stems were considered mature; all others were considered immature.

Results and Discussion
Prior to treatment application, sericea lespedeza comprised 1.1% of total basal cover and was not different between treatments (P=0.38; Table 1). One year later, sericea lespedeza had increased approximately three-fold to 4.0% of basal cover in control units. This dramatic 1-year increase highlights the invasive capabilities of sericea lespedeza when it remains untreated. Spray only, burn only, and burn-plus-spray units had substantially less sericea lespedeza than controls at the conclusion of the study (P=0.04). Interestingly, these 3 treatments were not different from one another in final sericea lespedeza frequency (P≥0.95).
When initial measurements were obtained, the proportion of individual 12 × 12-in plots containing sericea lespedeza, multiple stems of sericea lespedeza, and mature sericea lespedeza stems were not different between treatments (P≥0.16; Table 1). Following treatment, each of these three parameters followed a similar pattern to one another. The number of plots containing sericea lespedeza, multiple sericea lespedeza stems, and mature sericea lespedeza stems were not different between spray only, burn only, and burn-plus-spray units (P≥0.50); however, they were less than control units for all 3 measures (P≤0.03). The burn-plus-spray treatment was more effective at reducing sericea lespedeza stem length than were the other treatments. The weighted average sericea lespedeza stem length in burn-plus-spray units was the greatest at the outset of the experiment (P=0.03) but was less than in the control units following treatment application (P<0.01). The other treatments were not different (P≥0.10) from controls.
We interpreted these data to suggest that the efficacy of spraying only, burning only, or burning plus spraying for the control of sericea lespedeza did not differ substantially in this study. Notably, initial sericea lespedeza frequency in spray-only and burn-only units was low; therefore, it is plausible that the combination of fall burning and herbicide application would be a more effective treatment for rapidly reducing heavy sericea lespedeza infestations.
Achieving comprehensive control over sericea lespedeza is only one facet of the restoration process for Flint Hills pastures invaded by this weed. Additionally, a suitable treatment must be cost-effective and have limited detrimental effects on desirable, non-target native plant species. On the basis of cash costs alone, the burn only option is certainly less expensive at approximately $0.75/acre than alternatives that involve herbicide application. The effects of each of these treatments on non-target plant species are the subject of the second portion of this study.

Implications
Restoring pastures degraded by sericea lespedeza encroachment is crucial to protecting and enhancing Flint Hills grazing lands. Applying late summer prescribed fire, with or without fall herbicide application, results in strong suppression of sericea lespedeza. Burning alone appears to be an effective, low-cost means of sericea lespedeza control in areas of light infestation, while combining burning with spraying holds promise as a useful strategy to achieve more rapid control in pastures with heavy sericea lespedeza infestations.

Introduction
Native grassland ecosystems are of tremendous ecological and economic value. They perform a host of critical ecosystem functions, including providing habitat for many native plants and animals, preserving biodiversity, limiting soil erosion, and providing forage for native and domestic grazing herbivores. The tallgrass prairie originally stretched across more than 160 million acres of the United States. Today, this once-expansive grassland has been reduced to less than 5% of its original area.
The largest remaining portion of the historical tallgrass prairie lies in the Kansas Flint Hills. Protected from the plow by shallow, rocky soil, the Flint Hills are largely used today for grazing beef cattle production. These grazing systems utilize abundant, nutrient-dense early-summer forage production for low-cost cattle body weight gains. The dual roles of the Flint Hills region as a valuable ecological remnant and a profitable producer of beef cattle are now both under threat of degradation by a noxious invader: sericea lespedeza (Lespedeza cuneata).
Sericea lespedeza is an herbaceous perennial legume that is both highly invasive to native grasslands and toxic to beef cattle. Recognizing its rapidly increasing damage to native ecosystems and grazing beef cattle production, Kansas has designated sericea lespedeza as a noxious weed. Ranchers and other land managers have employed several strategies in search of comprehensive control of sericea lespedeza. Recent research has demonstrated that late-summer prescribed burning is an effective method of sericea lespedeza control. The first portion of the present study indicated that fall herbicide application following late-summer prescribed burning may be useful to achieve rapid control of heavy sericea lespedeza infestations.
Questions remain, however, about the impact of these sericea lespedeza control strategies on non-target plant species and broader ecosystem health parameters. Restoration of degraded tallgrass prairie is the ultimate goal of strategies designed to control sericea lespedeza. Therefore, the objective of this study was to evaluate the effects of late-summer prescribed burning and fall herbicide application on soil cover, native plant populations, and biological diversity.

Experimental Procedures
A single 80-acre pasture located in Riley County, KS, was used for our study. Hay has been harvested annually from this pasture during mid-summer despite a light to moderate infestation of sericea lespedeza present throughout the pasture. We divided the pasture into 16 distinct experimental units. Where possible, existing watershed boundaries and timber breaks were used to divide the units. Within each of the 16 units, a single, permanent 100-yd transect was established. The endpoints of each transect were identified by steel fence posts accompanied by numbered concrete blocks.
Forage biomass, soil cover, and plant species composition were measured prior to treatment application in August 2016 and again 1 year after treatment in August 2017. Forage biomass was estimated at 1-yd intervals along each 100-yd transect using a visual obstruction technique. Soil cover and plant species composition were measured using a modified step-point method. Along each transect, 100 individual points were randomly selected using a step-point device. Each of these points was classified as a hit on bare soil, litter, or plant basal material. The closest rooted plant and the closest forb in a 180° arc in front of the point were also recorded. These observations were used to calculate the plant species composition of each unit. This summary of plant species composition was then used to calculate the Simpson index, a commonly-used measure of biological diversity.
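The Simpson index calculation from step-point data can be sketched as follows. This sketch uses the 1 − Σp² form of the index, one common variant; the report does not specify which form was used, and the species counts below are hypothetical.

```python
# Compute a Simpson diversity index (1 - sum of squared proportions)
# from species hits recorded at step points along a transect.
# The 1 - sum(p^2) form is an assumption; the report does not state
# which variant of the index was used.
from collections import Counter

def simpson_index(species_hits):
    """Simpson diversity (1 - sum p_i^2) from a list of species names."""
    counts = Counter(species_hits)
    n = sum(counts.values())
    return 1.0 - sum((c / n) ** 2 for c in counts.values())

# Hypothetical 100-point transect: 60 big bluestem, 30 little bluestem,
# and 10 indiangrass hits.
hits = (["big bluestem"] * 60 + ["little bluestem"] * 30
        + ["indiangrass"] * 10)
print(round(simpson_index(hits), 2))  # 0.54
```

A single-species transect yields an index of 0, and the index rises toward 1 as hits are spread evenly across more species, matching the report's use of the index as a measure of species evenness.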
We randomly assigned each of the 16 units to 1 of 4 treatments: negative control, spray only, burn only, or burn-plus-spray. On September 2-6, 2016, a prescribed burn was conducted on the burn only and burn-plus-spray units. This burn was performed when appropriate weather conditions prevailed and with permission from Riley County Emergency Management, Manhattan, KS (permit no. 1488). We then waited approximately 3 weeks for sericea lespedeza to re-emerge after burning. After this re-emergence was observed, metsulfuron methyl (Escort XP, DuPont, Wilmington, DE) was broadcast-applied on September 19-26, 2016, at the recommended rate of 1 oz/acre to all spray only and burn-plus-spray units.

Results and Discussion
Although forage biomass was greater prior to treatment application than it was 1 year following treatment, this change was not different between treatments (P=0.12; Table 1). The difference between years is likely attributable to the decrease in summer rainfall in 2017 compared with 2016.
The change in bare soil was not different (P≥0.09) between negative control, spray only, and burn only units; however, bare soil increased (P=0.01) in the burn-plus-spray treatment compared to the negative control. Concurrently, litter cover in the burn-plus-spray units decreased (P=0.01) relative to negative controls. This sharp increase in bare soil and decrease in litter cover is a rangeland health concern. Bare soil may lead to decreased water infiltration, increased water runoff, and increased soil erosion. Additionally, it may provide a seedbed suitable for establishment of invasive species such as sericea lespedeza.
Although the proportion of the soil surface occupied by basal plant cover was not altered relative to controls (P=0.27; Table 1), the plant species composition of this cover differed among treatments. Spray-only units and burn-plus-spray units had increased proportions of grasses (P≤0.05) and decreased proportions of forbs (P≤0.05) compared to control and burn-only treatments. A single late-summer prescribed burn did not shift the plant species composition balance between grasses and forbs in this study. Herbicide application, however, provided preferential selection against forbs, independent of whether or not a prescribed burn was conducted.
Flint Hills native pastures are traditionally dominated by 4 perennial warm-season grass species that form the bulk of grazing cattle diets: big bluestem, little bluestem, indiangrass, and sideoats grama. In our study, the combined cover of these major warm-season grasses was not altered (P≥0.14; Table 1) by burning or spraying alone compared with the negative control but was increased (P<0.01) for the burn-plus-spray treatment compared to the negative control. We calculated the combined cover of native forb species to determine the impact of treatment on forbs other than weedy invaders such as sericea lespedeza. Native forb cover decreased (P=0.01) in burn-plus-spray units and tended to decrease in spray only units (P=0.06) but was not changed in burn only units (P=0.39) when compared to the negative control.
The change in the number of plant species recorded per 100 points was not different (P=0.62; Table 1) between treatments. This indicated that plant species richness, 1 of the 2 components of biological diversity, was not altered over a 1-year period in response to our treatments. We used the Simpson index to measure the species evenness component of biological diversity. This index decreased (P≤0.04) for the burn-plus-spray units relative to the other 3 treatments.
These results were interpreted to indicate that a single application of late-summer prescribed fire did not produce substantial changes to the overall health and vigor of native tallgrass prairie. Combining this burning with fall herbicide application, however, may be damaging to native forbs, biodiversity, and overall range health. We speculated that this may be the result of combining stress factors on forbs shortly before they enter seasonal dormancy. In the first portion of this study, we noted the potential use of a burn-plus-spray treatment to achieve more rapid control of a heavy infestation of sericea lespedeza. The benefits of curbing a major invasion of sericea lespedeza may make burn-plus-spray a desirable short-term strategy in some instances, but widespread or extended use of this practice should be approached with caution.

Implications
Assessing the overall impact on range health is an essential component of properly evaluating any sericea lespedeza control practice. A single application of late-summer prescribed fire did not produce substantial changes in the vigor, composition, or diversity of major range plant species. Adding a subsequent fall herbicide application, however, did result in an increase in bare soil, a loss of native forb cover, and a decrease in biological diversity.

Introduction
Limit-feeding high-energy diets based primarily on wet corn gluten feed has been shown to significantly increase efficiency in growing cattle. A trial conducted recently at the Beef Stocker Unit utilized a diet offered ad libitum to determine the amount of feed offered in the limit-fed diets. Results demonstrated significantly improved performance. However, in a production setting, the producer would not feed one diet to determine the level of feed offered for a separate diet. To utilize this programmed feeding strategy, specific levels of intake would need to be known, without the use of a high-roughage, low-energy control diet, to achieve desired gains. In addition, Zelnate, a novel DNA immunostimulant, had not been previously evaluated under these dietary conditions to determine its effects on cattle health and performance.

Experimental Procedures
A total of 370 Angus × Brahman heifers were assembled from a single source in central Florida and shipped to the Kansas State University Beef Stocker Unit, Manhattan, KS, (1,455 mi) over a 2-day period from August 11 to 12, 2016 (2 loads each day). The heifers were used to validate results observed in an earlier experiment conducted at the Beef Stocker Unit involving high-energy limit-fed receiving diets based primarily on wet corn gluten feed and the use of a novel DNA-immunostimulant technology injected intramuscularly at the time of arrival processing (Zelnate, Bayer Animal Health, Shawnee Mission, KS). The two experimental diets were formulated to provide 50 or 60 Mcal net energy for gain/100 lb dry matter, offered at 2.4 and 2.2% of body weight daily, respectively (Table 1). In addition, 1 of 2 arrival management protocols was implemented, in which animals did or did not receive Zelnate on day 0, in a 2 × 2 factorial arrangement of treatments. Treatment diets were fed for 42 days; then all animals were switched to the 50 Mcal net energy for gain/100 lb dry matter diet fed for ad libitum intake for 2 weeks to equalize gastrointestinal tract fill. Individual animal weights were measured on days 0, 14, and at the conclusion of the trial (day 56). A pen scale was used on days 28 and 42. Feed delivery was adjusted on a pen basis based on updated cattle weights measured at each weigh period.

Results and Discussion
Despite extremely low intakes early in the receiving period, cattle were relatively healthy; only 2 animals were pulled for malnutrition within the first two weeks after arrival. Both were from the high-energy/Zelnate treatment but most likely succumbed to issues unrelated to diet, given their refusal to eat and the overall stress of the long transport. Performance data can be found in Table 2. Neither the use of Zelnate on arrival nor the interaction of diet and Zelnate had an effect on the performance or health parameters measured in this trial. Dietary treatment affected dry matter intake throughout the trial by design, as intakes were limited to specific percentages of average pen body weight daily (P=0.01). Average daily gain was not affected by dietary treatment (P=0.75), validating the results of the prior experiment at the Beef Stocker Unit from which these prescribed intakes were extrapolated. Because similar gains were achieved while the amount of feed delivered varied, efficiency (gain-to-feed ratio) was affected by dietary treatment (P=0.03), with the high-energy, most-restricted diet being the more efficient feeding strategy.
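The arithmetic behind the efficiency result can be sketched as follows. The body weight and daily gain below are illustrative assumptions, not the trial's measured values; only the 2.2% and 2.4% of body weight feeding levels come from the study design.

```python
# Hypothetical illustration of the gain-to-feed comparison: if average daily
# gain is similar on both diets, the diet offered at the lower percentage of
# body weight is the more efficient one.

def gain_to_feed(adg_lb: float, dmi_lb: float) -> float:
    """Gain-to-feed ratio: lb of gain per lb of dry matter consumed."""
    return adg_lb / dmi_lb

body_weight = 500.0                    # lb, example animal (assumed)
adg = 2.0                              # lb/day, assumed similar for both diets

dmi_high_energy = body_weight * 0.022  # 60 Mcal diet fed at 2.2% of BW -> 11.0 lb
dmi_low_energy = body_weight * 0.024   # 50 Mcal diet fed at 2.4% of BW -> 12.0 lb

gf_high = gain_to_feed(adg, dmi_high_energy)  # ~0.182
gf_low = gain_to_feed(adg, dmi_low_energy)    # ~0.167

print(f"high-energy G:F = {gf_high:.3f}, low-energy G:F = {gf_low:.3f}")
```

At equal gain, the roughly 8% lower feed delivery of the high-energy diet translates directly into a higher gain-to-feed ratio.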

Implications
Limit-feeding a diet formulated to supply 60 Mcal net energy for gain/100 lb dry matter at 2.2% of body weight to target similar average daily gain is a more efficient feeding strategy than offering a higher-roughage, lower-energy diet at 2.4% of body weight to achieve similar gain. The use of Zelnate did not have an effect on any of the parameters measured in this trial.

Introduction
Feet and leg structure traits play a critical role in beef cattle production. Poor feet and leg traits rank as a top reason for culling beef cows from a herd, and culling is very costly to producers if an animal has not generated enough income to pay for its development costs. Feet and leg traits are lowly heritable, but genetic progress can still be made with selection pressure for improved feet and leg soundness.
Beef producers select for soundness as they make decisions about culling animals; however, many of these animals may have been in a herd long enough to make a negative lasting impact. This causes particular harm to seedstock operations, especially with young unproven bulls being used in artificial insemination and for natural service because it is unknown if they will develop poor feet and leg traits later in life. Developing a feet and leg genetic evaluation to identify problem animals earlier in their life could be one method to make genetic improvement for herd profitability and longevity.
Feet and leg trait evaluation has been present in the dairy industry since the early 1980s and currently ranks as one of the most economically relevant traits in their genetic evaluations. The Australian Angus Association recently incorporated feet and leg traits into their genetic evaluation. The association reports estimated breeding values for front feet angle, front feet claw set, rear feet angle, rear leg hind view, and rear leg side view. Feet and leg traits are typically scored on an intermediate scale as an ordered category with the desirable score in the middle and less desirable scores on the two extremes.
The Red Angus Association of America and American Simmental Association breed organizations are currently developing a genetic evaluation and selection tool for feet and leg traits. The goals of this study were to identify feet and leg indicator traits and develop a scoring method that can be easily adopted by cattle producers for both breeds.

Experimental Procedures
Feet and leg phenotypes were obtained from August 2015 through September 2017 for 14 traits: body condition score, front hoof angle, front hoof depth, front claw shape, rear hoof angle, rear hoof depth, rear claw shape, foot size, front side view, front hoof orientation, knee orientation, rear leg side view, rear leg hind view, and composite score. Scores were obtained by trained livestock evaluators as subjective measurements, and each animal was scored by at least two observers. A scale of 1-100 was used for all feet and leg traits (1 and 100 are extreme, 50 is desirable); body condition score was measured on a scale of 1-9 (1 and 9 are extreme, 5 is desirable); and composite score was measured on a scale of 1-50 (1 and 50 are extreme, 25 is desirable). Feet and leg trait phenotypes were later truncated to a 1-9 scale (1 and 9 are extreme, 5 is desirable), and composite score was truncated to a 1-4 scale (1 and 4 are extreme, 2.5 is desirable).
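One plausible way to truncate a 100-point score to the 9-point scale described above is to divide the 100-point range into nine equal bins; the study does not specify its exact mapping, so the function below is an assumption for illustration only.

```python
# Assumed equal-bin mapping from a 1..100 score to a 1..9 score.
# The study's actual truncation rule is not specified in the text.

def truncate_score(score: int, old_max: int = 100, new_max: int = 9) -> int:
    """Map a score on a 1..old_max scale to a 1..new_max scale via equal bins."""
    if not 1 <= score <= old_max:
        raise ValueError(f"score {score} outside 1..{old_max}")
    # ceiling division: ceil(score * new_max / old_max)
    return -(-score * new_max // old_max)

print(truncate_score(50))                       # desirable mid-scale score
print(truncate_score(1), truncate_score(100))   # extremes map to 1 and 9
```

Under this mapping the desirable mid-scale score of 50 lands at 5, matching the desirable midpoint of the 9-point scale.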
Measurements were collected using an electronic tablet with a scoring survey and offline data storage capacity.
Data were obtained on 1,885 Red Angus cattle, and after editing, 1,720 records were used for analysis. A three-generation pedigree file containing 13,306 animals was provided by the Red Angus Association of America. Data were analyzed using multiple bivariate logistic animal models with random additive genetic and residual effects and fixed effects of contemporary group and age in months.
Contemporary group was defined as the combination of herd and year born. There were 48 contemporary groups represented in the Red Angus dataset. Bull age ranged from 1 to 2 years, with no bulls older than 3 years of age, as most production systems would not have large contemporary groups of mature bulls. Cow age ranged from 1 to 18 years, with fewer animals represented at older ages.

Results and Discussion
Average heritability and standard error estimates for both measures of scale can be found in Table 1. Heritability estimates were moderate for traits such as front hoof angle (0.18-0.20), rear hoof depth (0.24-0.25), foot size (0.29-0.36), and rear leg side view (0.29-0.30). This indicates that genetic selection pressure on these traits can result in improved feet and leg conformation. Estimates that were lowly heritable, yet could still exhibit progress through selection, were rear hoof angle (0.17-0.19), rear claw shape (0.15-0.17), front side view (0.17), and rear leg hind view (0.11-0.14). Comparison of the heritability estimates from the two scales indicates that granularity of scale, that is, using the 9-point scale rather than the 100-point scale, had little effect on heritability.

Implications
Feet and leg traits are moderately to lowly heritable; however, producers can still select on traits for improved soundness. Scoring on a less granular measurement of scale (1-9) may be an important simplification for developing an established scoring method for Red Angus and Simmental producers. Further validation on a larger and more diverse population of Red Angus and Simmental cattle could help further understand feet and leg trait differences and their impacts on herd profitability.

Introduction
Defining the breeding objective or goal is the most important step in a breeding program. The objective is a combination of economically important traits in a production system. The economic importance of biological traits to be included in a breeding goal is evaluated by their economic value, or the expected increase in profit resulting from a unit increase in a trait due to selection. Modeling is the main tool for deriving economic values for important production traits, either through profit equations or through bio-economic models. According to Roughsedge et al. (2003), bio-economic models integrate complex models of animal biology with principles of farm management and prices of farm inputs and outputs. The objective of this study was to estimate economic values for production traits in a full life cycle system using a bio-economic model for a purebred Angus system and a terminal crossbreeding system with Nelore sires mated to Angus dams.

Experimental Procedures
Phenotypic data were collected from the Bifequali crossbreeding scheme from the Embrapa Pecuária Sul Research Center of the Brazilian Agricultural Research Corporation (Embrapa), located in the city of Bagé, Rio Grande do Sul State, Brazil. The data consisted of progeny performance and carcass trait phenotype (Table 1) of Angus purebred and Nelore sires mated to Angus dams raised in a pasture-based production system from birth to slaughter.
The economic characterization (Table 2) of the system was based on fixed costs (taxes, depreciation, land opportunity, and opportunity costs of invested capital) and variable costs (sanitation, handling, reproduction, labor, etc.). Since the system was pasture-based, direct measurement of forage consumption was not possible. Instead, feed costs were estimated from the energy requirements of the different animal categories (growing animals, heifers in reproduction, and dams in reproduction) according to NRC equations and those of Buskirk et al. (1992).
The bio-economic model was developed in the 'R' programming language using phenotypic performance data and associated production costs. Fertility and survival rates were used to develop a Leslie matrix model that accounts for the age at first calving of heifers (in this case three years old), pregnancy rates for each age class of dams, and the survival of each animal category. In the Leslie matrix, the herd started with 500 females distributed in categories from 0 to 15 years old, and after a simulated cycle of 500 years the herd stabilized at an inventory of 642 females. In the crossbred scheme, all of the offspring are marketed, so replacement heifers must be purchased or produced in a separate breeding unit; in the current simulation, purchased replacements were modeled. A replacement rate of 28.5% was modeled using the stayability rate, or the probability of a female staying in production to at least six years of age.
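The age-structured projection described above can be sketched with a minimal Leslie-matrix model. The fertility and survival rates below are illustrative assumptions, not the study's parameters; the only feature carried over from the text is that heifers first calve at three years of age.

```python
# Minimal Leslie-matrix sketch of an age-structured female herd projection.
# All rates are illustrative assumptions, not the study's values.

AGES = 16  # age classes 0..15 years, as in the study's herd structure

# heifer calves produced per female per year (zero before first calving at 3)
fertility = [0.0, 0.0, 0.0] + [0.40] * (AGES - 3)
# probability of surviving to the next age class
survival = [0.90] * (AGES - 1)

def project(pop, years):
    """Project the age-structured female population forward `years` steps."""
    for _ in range(years):
        births = sum(f * n for f, n in zip(fertility, pop))
        pop = [births] + [s * n for s, n in zip(survival, pop[:-1])]
    return pop

start = [500.0 / AGES] * AGES  # 500 females spread across age classes
pop = project(start, 50)       # long projection -> stable age structure
growth = sum(project(pop, 1)) / sum(pop)
print(f"asymptotic annual growth rate ≈ {growth:.3f}")
```

After a long projection the ratio of successive herd totals converges to the dominant eigenvalue of the Leslie matrix; in the study, this projection (with culling and replacement added) is what stabilized the herd at 642 females.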
In Brazil, slaughter companies have a premium system based on age as measured by dentition and carcass weight (Table 3). These premiums are paid according to the base market price and pricing schedule. Mean carcass weight and its standard deviation determine which category each animal would fit into for this model. Revenues came from the sale of finished steers, cull heifers, and cows.
To estimate the resulting economic values, the bio-economic model was first parameterized and a base profit calculated. The breeding goal was defined for a full life cycle production system, which determined the traits with economic importance. Each trait in the breeding objective was then sequentially increased by one unit while the other traits were held constant. The difference between the profit from each perturbed simulation and the profit from the baseline simulation, divided by the number of dams, gave the relative economic value of the respective trait. The traits in the breeding objective were mature cow weight, birth rate, yearling weight, live weight at slaughter, carcass weight, dressing percentage, and fat thickness.
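The derivation above can be condensed into a few lines. The `profit_model` below is a toy stand-in with made-up coefficients; the study's actual bio-economic model (written in R) is far more detailed, but the perturb-and-difference step is the same.

```python
# Sketch of the economic-value derivation: perturb one trait by one unit,
# re-run the profit model, and divide the profit change by the number of dams.

def profit_model(traits: dict) -> float:
    """Toy profit function (illustrative only): revenue rises with carcass
    weight; maintenance cost rises with mature cow weight."""
    return 4.0 * traits["carcass_weight"] - 0.8 * traits["mature_cow_weight"]

def economic_value(traits, trait_name, n_dams, step=1.0):
    """Change in profit per dam from a one-unit increase in one trait."""
    base = profit_model(traits)
    perturbed = dict(traits, **{trait_name: traits[trait_name] + step})
    return (profit_model(perturbed) - base) / n_dams

base_traits = {"carcass_weight": 600.0, "mature_cow_weight": 1100.0}
ev = economic_value(base_traits, "carcass_weight", n_dams=642)
print(f"economic value of carcass weight: ${ev:.4f}/dam")
```

Note that in this toy model a heavier mature cow yields a negative economic value, mirroring the cost-versus-revenue trade-off discussed for mature cow weight below.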

Results and Discussion
The profitability of an activity tells us whether it can continue in the long term. If profit is positive, revenue can cover direct expenses, depreciation, and the opportunity costs of land and invested capital. In this study, both systems were profitable; however, the crossbreeding system generated more profit per herd (Angus herd = $16,316.23, Nelore × Angus herd = $30,881.28).
Economic values (Table 4) vary across the two systems because of differences in the importance of each trait as a source of return and of cost. Mature cow weight had a positive but small economic value: increasing cow weight directly increases revenue from cull cows, but selection for high mature weight can also increase the energetic costs associated with maintenance.
Birth rate affects all sources of revenue and cost. When birth rate was changed in the crossbreeding scenario, costs were higher because the energetic cost of producing one calf is greater, owing to the weights of both cow and calf. Additionally, the marginal value of increasing birth rate through selection is diminished by the higher reproductive rate already expected of crossbred cows due to heterosis.
Dressing percentage had the largest numeric economic value: when dressing percentage increases, carcass weight increases and so does the associated revenue. The trait carried a larger economic weight for Nelore × Angus cows than for Angus cows. Traditionally, crossbred Zebu animals have higher dressing percentages than British animals; additionally, Nelore × Angus cows have lower relative weights for the legs, head, hide, and digestive tract.

Implications
The use of crossbred animals is a good tool to improve economically important traits and profitability in a full cycle beef production system in Brazil.

Introduction
Source and age verification programs are essential for many United States beef export markets and some branded beef programs. While some source and age verification programs were introduced as early as the mid-1990s, many current programs were introduced from 2004 through 2006. In 2003, the United States had a case of bovine spongiform encephalopathy. This event drastically decreased beef exports and halted trade with countries such as Japan and South Korea. These export markets slowly began to reopen, with restrictions limiting exported beef to cattle harvested at a maximum of 20 and 30 months of age, respectively (USDA, 2017).
Cattle meeting the export age restrictions are identified through source and age verification programs. Beef from enrolled calves can be exported to these restrictive markets.
More recently, after nearly 14 years of market closure following the bovine spongiform encephalopathy case in 2003, the United States signed a trade agreement with China. One of the requirements for beef export to China includes source and age verification of cattle to ensure they are less than 30 months of age at harvest (Inouye, 2017). In the coming years, this may provide more incentive for producers to enroll calves in source and age verification programs. As the beef industry has evolved, more descriptors of calves selling through a video auction service have been communicated from buyer to seller and recorded, including source and age verified status. The objective of this study was to quantify the effect of source and age verification status on the sale price of lots of beef calves sold via summer video auctions from 2010 through 2017 while adjusting for all other factors that significantly influenced price.

Experimental Procedures
Information describing factors about lots sold through a livestock video auction service (Superior Livestock Auction, Fort Worth, TX) were obtained from the auction service in an electronic format. These data were collected for lots of beef calves offered for sale during summer sales from 2010 through 2017.
A number of descriptive pieces of information were available for each lot of calves. The specific and current requirements of each of the video auction service's special health and management programs are available at www.SuperiorLivestock.com.
Factors describing the lots of beef calves that were not numeric in the original file received from the video auction service were classified into well-defined groups, and each group within a factor was assigned a numeric code. A multiple-regression model was developed using a backward selection procedure to quantify the effects of factors on the sale price of beef calves.

Results and Discussion
Data analyzed were collected from 61 summer livestock video auctions from 2010 through 2017. There were 36,570 lots of beef calves used in the analyses. In all years of the analysis, source and age verification influenced the sale price of lots of beef calves. The largest premium associated with lots enrolled in a program was $4.07/cwt (Table 1); the smallest, $1.02/cwt, occurred in 2014. The average premium from 2010 through 2017 for source and age verified lots was $2.25/cwt. This premium may not have been enough for the average producer selling calves via video auction to offset the cost and time associated with enrolling in the various source and age verification programs.
The percentage of lots of beef calves enrolled in a source and age verification program decreased from 46.5% to 27.5% over this period (Figure 1). Premiums for source and age verified calves marketed in 2017 were greater than those received in 2016. Recently reopened beef export markets for the United States will likely increase both the premiums and the percentage of lots enrolled in source and age verification programs in the coming years.

Implications
The decision to enroll beef calves in a source and age verification program is ultimately a decision made by the producer and until premiums increase, not all producers may find this value-added management practice profitable.

Introduction
The average cow herd in the United States has shifted to be primarily black-hided. In previous years, black-hided cattle tended to sell for higher prices, and many producers took advantage of this opportunity for a premium on their calves. The black hide color came from increased use of Angus genetics, which also allows calves to qualify for various value-added or branded beef programs (Wessler, 2011; Eastwood et al., 2017). Producers use specific sire breeds depending on their production goals, available resources, and environment. As the cow herd has become primarily black-hided, some producers may re-evaluate the sire breeds used in their herds, potentially to capture hybrid vigor in calves through use of other breeds. The objective of this study was to characterize the change in the percentage of lots of beef calves sired by a single breed marketed via video auction from 2010 through July 14, 2017.

Experimental Procedures
Information describing factors about lots sold through a livestock video auction service (Superior Livestock Auction, Fort Worth, TX) were obtained from the auction service in an electronic format. These data were collected for lots of beef calves with a single sire breed offered for sale from 2010 through July 14, 2017. The sire breed of a lot was determined based on lot description information provided by the seller and sales representative. For a beef calf lot to be included in a sire breed category, all calves in a lot must have been sired by a single breed. A minimum of 50 lots of calves were required for a single sire breed to be included in the analysis. The single sire breed categories included in this analysis were Angus, Brangus, Charolais, Hereford, Red Angus, and SimAngus.
The Cochran-Armitage trend test was used to determine the presence of an increasing or decreasing trend in the percentage of lots sired by Angus, Brangus, Charolais, Hereford, Red Angus, and SimAngus bulls over time with a P≤0.05 considered significant. Lots of beef calves originating from the Northeast region (Connecticut, Delaware, Massachusetts, Maryland, Maine, New Hampshire, New Jersey, New York, Ohio, Pennsylvania, Rhode Island, Vermont, and West Virginia) were excluded from the study because of very few lots in this region.
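A minimal Cochran-Armitage trend test can be written directly from its definition; the implementation below uses the standard normal approximation, and the yearly counts at the bottom are made-up illustrative data, not the study's.

```python
import math

def cochran_armitage(successes, totals, scores=None):
    """Two-sided Cochran-Armitage test for a linear trend in proportions.

    successes[i]: lots in the category of interest (e.g. Angus-sired) in year i
    totals[i]:    all lots in year i
    scores[i]:    ordinal score for year i (defaults to 0, 1, 2, ...)
    Returns (z, p_value) based on a normal approximation.
    """
    if scores is None:
        scores = list(range(len(totals)))
    N = sum(totals)
    p = sum(successes) / N
    t = sum(s * r for s, r in zip(scores, successes))
    sn = sum(s * n for s, n in zip(scores, totals))
    expected = p * sn
    var = p * (1 - p) * (sum(n * s * s for s, n in zip(scores, totals)) - sn * sn / N)
    z = (t - expected) / math.sqrt(var)
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal tail
    return z, p_value

# Illustrative: a steadily falling share of lots sired by one breed
breed_lots = [820, 800, 780, 750, 720, 700, 690, 660]
all_lots = [1000] * 8
z, pval = cochran_armitage(breed_lots, all_lots)
print(f"z = {z:.2f}, P = {pval:.3g}")
```

A negative z indicates a decreasing trend across years, which is how the P≤0.05 criterion in the study would flag the declining share of Angus-sired lots.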

Results and Discussion
There were 29,535 lots of beef calves offered for sale via 178 video auctions through Superior Livestock Auction from 2010 through July 14, 2017, and included in this analysis. The percentage of lots of beef calves sired by Angus bulls decreased (P<0.0001) over this period (Table 1). Angus-sired lots, however, comprised the greatest percentage of single-breed sired lots marketed, ranging from 70 to 82% across all years (Table 2). Percentages of lots of beef calves sired by Brangus, Charolais, Red Angus, and SimAngus bulls increased (P<0.001). However, there was no evidence of change (P=0.16) in the percentage of lots of beef calves sired by Hereford bulls.

Implications
Many producers have likely taken advantage of premiums associated with black-hided calves, but producers marketing calves via video auction may be changing the sire genetics used on a primarily black cow herd.

Introduction
Graber et al. (1985) studied mineral supplementation with stockers grazing Flint Hills native grass pastures and concluded that improvements in performance may or may not occur when investing in this management practice. While most stocker operations today utilize some source of complete mineral, some producers use only salt while their calves are on pasture. The objective of this study was to determine the efficacy of providing salt alone or with injectable trace minerals, compared to a complete mineral supplement, and of growth implants for improving the growth of stocker calves grazing native grass pastures in the Flint Hills region of Kansas.

Experimental Procedures
Crossbred steers originating from Texas and New Mexico (n = 248; initial body weight = 697.8 ± 9.6 lb) were randomized by initial weight across 15 pastures. All steers in this study had previously been used in a receiving study that focused on limit feeding either wet distillers grains or corn gluten feed at 2% of body weight (Spore et al., 2018). The final weights of the receiving trial were used to randomly assign each animal to a treatment.
Pastures were randomly assigned to three treatment groups: (1) salt block only; (2) salt block and Multimin90; and (3) a Kansas State University complete mineral (Brazle, personal communication; Table 1) formulated for 3 oz/head daily consumption. Multimin90 is an injectable chelated aqueous source of supplemental trace minerals administered at 1 mL/100 lb body weight (1 mL contains 60 mg zinc, 10 mg manganese, 5 mg selenium, and 15 mg copper).
Within each pasture treatment group, equal numbers of steers were randomly assigned to one of three implant treatments: Ralgro (36 mg zeranol), Revalor-G (40 mg trenbolone acetate and 8 mg estradiol; Merck Animal Health, Madison, NJ), or no implant. Stocking rates were based on pasture size (avg: 250 ± 5.2 lb/acre). All steers were weighed individually on days 0 and 90.
On a weekly basis, the mineral feeders and salt blocks were weighed to determine consumption, and the collected data were used to calculate the previous week's mineral intake. Consumed mineral was replaced, and the distance to a water source was adjusted as needed to achieve the 3 oz/head daily target. Mineral feeders were moved and opened in pasture blocks to ensure that intake differences were due to mineral treatment rather than human error.
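Back-calculating intake from the weekly feeder weights is simple disappearance arithmetic. The feeder weights and head count below are hypothetical, used only to show the calculation.

```python
# Weekly mineral intake from feeder disappearance (illustrative numbers).

def oz_per_head_daily(start_lb, added_lb, end_lb, head, days=7):
    """Average mineral disappearance, oz per head per day."""
    consumed_lb = start_lb + added_lb - end_lb
    return consumed_lb * 16 / (head * days)  # 16 oz per lb

# e.g. feeder held 40 lb, 20 lb added mid-week, 36.5 lb left; 16 steers
intake = oz_per_head_daily(40.0, 20.0, 36.5, head=16)
print(f"{intake:.2f} oz/head daily")  # compare against the 3 oz/day target
```

Weekly values like this are what allowed feed placement (e.g. distance to water) to be adjusted toward the 3 oz/head daily target.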
All calves were inspected multiple times weekly throughout the trial for pinkeye, lameness, and other ailments. If diagnosed with foot rot or pinkeye, cattle received Bio-Myocin 200. Upon conclusion of the study, all steers were weighed and placed in a small pasture overnight before shipping to a commercial feedlot.
Data were analyzed using the MIXED procedure (version 9.4, SAS Inst. Inc., Cary, NC). Data were arranged in a randomized incomplete block design, with pasture serving as the experimental unit for growth and health outcomes as impacted by treatment. In the model, the fixed effects were treatment and pasture while the random effects were pasture × treatment, pasture, and animal identification.

Results and Discussion
There were no statistical interactions; therefore, only the main effects of mineral supplementation and implant are presented in Tables 2 and 3, respectively. There were no significant differences in average daily gain (P=0.40) from salt or mineral supplementation (Table 2). It should be noted that all steers in this study were previously used in a 90-day growing study with diets well-fortified with macro and trace minerals that may have influenced the results observed. The results in Table 4 show the forage quality at three time points of the experimental pastures during the course of the experiment. Overall, the crude protein levels in the complete salt and mineral supplement pastures were significantly higher (P<0.02) than the other two treatments.
Calves implanted with either Revalor-G or Ralgro gained significantly faster than non-implanted calves (average of 8.9%, P=0.02). Average block salt intake was approximately 1.43 oz/head daily, while daily intake of the K-State complete mineral was 3.3 oz/head. Salt and complete mineral intakes were fairly consistent throughout the 13-week trial, with greater usage associated with periods of high precipitation (Figure 1). Two calves were pulled or treated during the trial: one for a broken leg and the other for lameness.

Implications
While there was no growth response to salt block and injectable trace mineral supplementation when compared to a complete mineral supplementation, there was a significant growth response with growth implants.

Introduction
Intensive-early grazing of yearling beef steers or heifers in Kansas allows beef producers to maximize growth and production per acre during the period of the year associated with elevated forage quality in the Flint Hills. The noxious weed sericea lespedeza (Lespedeza cuneata) threatens this long-used and successful production practice. Cattle are highly sensitive to condensed tannins in sericea lespedeza, and experienced beef cattle show strong aversions to grazing it. It is unknown to what degree naive yearling cattle may select sericea lespedeza during normal grazing activities.
Microhistological analysis of feces has been used to estimate botanical composition of the diets of domesticated and wild herbivores. Plant fragments in feces are evaluated under a microscope to identify and count individual plant species. Our objective was to use microhistological analysis of feces to characterize the diets of yearling steers managed under an intensive-early stocking regime on native tallgrass pastures with significant infestations of sericea lespedeza.

Experimental Procedures
The experiment was conducted on eight native tallgrass pastures located in Woodson County, KS, at the Kansas State University Bessner Range Research Unit during the 2015 and 2016 growing seasons. Pastures were burned annually in April and stocked with yearling steers (n = 281/year; initial body weight = 582 ± 75 lb) at a relatively high stocking rate (2.7 acres/steer) from April 15 to July 15. Steers were sourced from various commercial cattle growers in southeastern Kansas and assigned randomly to pastures. Basal frequency of sericea lespedeza was 2.9 ± 2.43% during the period of our experiment.
Four 328-ft transects were laid out in a north-south gradient in each pasture. Following a 2-week adaptation period, fecal samples were collected bi-weekly from fresh fecal pats along each transect from May 1 to July 15 annually. Care was taken to avoid contaminating fecal material with soil or vegetation and samples were frozen at -4°F pending processing and analysis.
Fecal samples were dried in a forced-air oven at 131°F for 96 hours then finely ground. Following grinding, samples were composited by transect within sampling date and mixed for 120 minutes to achieve homogenization.
Individual subsamples (0.017 to 0.035 oz) of fecal composites and comparison standards were soaked in 50% ethanol solution overnight, then washed for 5 minutes with de-ionized water through a No. 200 US-standard sieve. Samples were then soaked in 0.05 M sodium hydroxide for 20 minutes and washed again with de-ionized water. A small amount of each sample or standard was placed on a microscope slide (5 slides/ fecal sample and 3 slides/standard sample) with a dissecting needle. One to 3 drops of Hertwig's solution was applied to each slide, and then slides were held over a propane flame until dry.
Slides were viewed under a compound microscope equipped with a digital camera (DC5-163, Thermo Fisher Scientific, Asheville, NC) at 100 × magnification. Twenty slide fields from each slide were randomly selected, photographed, and stored. Individual plant fragments on each sample-slide field of view were counted and identified by plant species. The total number of fragments of each plant species on a given slide were converted to frequency of occurrence using the following equation: (total of individual species ÷ total of all species) × 100. Plant fragments not similar to the 17 range plant standards were classified as either unknown forb or unknown grass.
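The frequency-of-occurrence equation above is a straightforward counting exercise. The species names and fragment counts below are illustrative, not data from the study.

```python
from collections import Counter

# Frequency of occurrence = (count of one species / count of all species) x 100,
# applied to fragment counts from one slide (illustrative counts).

def frequency_of_occurrence(counts):
    """Convert fragment counts per species to percent of all fragments."""
    total = sum(counts.values())
    return {species: 100 * n / total for species, n in counts.items()}

fragments = Counter({"big bluestem": 46, "little bluestem": 30,
                     "switchgrass": 14, "sericea lespedeza": 1,
                     "unknown forb": 9})
for species, pct in frequency_of_occurrence(fragments).items():
    print(f"{species}: {pct:.1f}%")
```

Summing these percentages across the 20 photographed fields per slide, and across slides, yields the dietary proportions reported in the results.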

Results and Discussion
Selection of individual plant species by steers was not influenced (P≥0.09) by pasture or by the interaction between pasture and time of collection (i.e., period); therefore, period sums of squares were partitioned using orthogonal polynomial contrasts (Table 1). The proportions of total grass and grass-like plants and total forbs and forb-like plants in the diets of grazing steers did not differ (P=0.37) between sampling periods, indicating that steer diets were strongly dominated by grasses (≥ 88.4% of diets). More than 99% of grass plants selected by steers were represented by the 8 grass reference standards, whereas more than 97% of forbs selected by steers were represented by the 9 forb reference standards.
Steer selection of big bluestem, common ragweed, western ragweed, sericea lespedeza, dotted gayfeather, annual broomweed, and heath aster was also not influenced (P≥0.07) by sampling period. Conversely, steer selection of little bluestem decreased (P<0.01) linearly with advancing season, whereas selection of switchgrass, indiangrass, and Baldwin's ironweed increased (P≤0.04) linearly with advancing season. Proportions of all other reference plants in steer diets were influenced (P≤0.05) by sample collection period, likely reflecting plant availability, growth cycle, or palatability. The dietary proportion of sericea lespedeza selected by freely grazing yearling beef steers was small (trace amounts to 0.1% of the diet); naive steers evidently learned to avoid sericea lespedeza early in the grazing period.

Implications
In this experiment, grasses comprised not less than 88.4% of steer diets, whereas forbs comprised not more than 11.6% of steer diets. Selection of grasses by steers was somewhat greater than that reportedly selected by beef cows at the same time of year. Major grasses in steer diets were big and little bluestem, switchgrass, and indiangrass. Sericea lespedeza comprised only a minor proportion of grazing steer diets; in fact, sericea lespedeza consumption was below the detection threshold of our analyses in 4 out of 5 sampling periods. This finding highlights the difficulty of controlling sericea lespedeza with yearling cattle grazing alone.

Introduction
Microhistological analysis of feces allows characterization of herbivore diets by identifying plant fragments under a microscope. These plant fragments are counted and used to calculate the frequency of individual plant species in the diet. Our objective was to characterize the diets of sheep grazing sericea lespedeza-infested pastures during the growing season, following intensive-early grazing by yearling steers. A similar analysis characterizing the diets of stocker cattle appears in this report.

Experimental Procedures
The study was conducted at the Kansas State University Bessner Range Research Unit, located in Woodson County, KS, during the summers of 2015 and 2016. Eight native tallgrass pastures were burned annually in April. Pastures were infested with sericea lespedeza (basal frequency = 2.9 ± 2.43%). Pastures were stocked with yearling steers at a relatively high stocking rate (2.7 acres/steer) from April 15 to July 15, prior to sheep introduction.
Approximately 95% of above-ground biomass on pastures used in our study was composed of the following forage species: big bluestem, little bluestem, switchgrass, indiangrass, blue grama, sideoats grama, buffalograss, sedge spp., purple prairie clover, leadplant, dotted gayfeather, heath aster, sericea lespedeza, Baldwin's ironweed, western ragweed, annual broomweed, and common ragweed. These 17 dominant plant species were collected for use as comparison standards for microhistological analyses. Each standard sample was derived by hand-clipping 10 to 20 individual plants from a homogeneous stand of each plant type; samples were dried in a forced-air oven (131°F; 96 hours) and finely ground using a cyclone-style sample mill.
Mature ewes (n = 813 ± 0.46/year; initial body weight = 141 ± 9 lb) were obtained from 2 commercial sheep ranches located in western Kansas. Ewes were assigned randomly to graze 1 of 4 pastures (0.5 acres/ewe); the remaining pastures were not grazed from August 1 to October 1. Sheep were allowed a 2-week adaptation period before fecal samples were collected from 25 randomly selected individuals on August 15. Collections from the same individuals were repeated on September 15 (i.e., 2 sampling dates). Wet fecal samples were placed into plastic resealable bags and frozen (-4°F) pending further analysis. Pasture treatment assignments were fixed for the 2-year duration of the study.
Individual fecal samples were dried in a forced-air oven (131°F; 96 hours) then finely ground in a hammer mill. Next, 0.035 oz of each fecal sample and each comparison standard was soaked in 50% ethanol solution overnight, then washed for 5 minutes with de-ionized water through a No. 200 US-standard sieve. Samples were then soaked in 0.05 M sodium hydroxide for 20 minutes and washed again with de-ionized water through the sieve. Samples were placed onto microscope slides (5 slides/fecal sample and 3 slides/standard sample) with a dissecting needle. Two to 3 drops of Hertwig's solution were applied to slides which were then held over a propane flame until dry.
Slides were viewed under a compound microscope equipped with a digital camera (DC5-163, Thermo Fisher Scientific, Asheville, NC) at 100× magnification. Twenty slide fields were randomly selected from each sample and each standard, photographed, and stored. Plant fragments in each photograph were individually counted and identified. The total number of occurrences of each plant species on a given slide was converted to frequency of occurrence ([total of individual species ÷ total of all species] × 100).
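The frequency-of-occurrence arithmetic described above can be sketched in a few lines of Python; the fragment counts below are hypothetical and stand in for the counts tallied from one set of slides.

```python
# Hypothetical fragment counts tallied from one set of slides.
fragment_counts = {
    "big bluestem": 34,
    "little bluestem": 28,
    "switchgrass": 10,
    "sericea lespedeza": 3,
}

total = sum(fragment_counts.values())

# Frequency of occurrence = (total of individual species / total of all species) x 100
frequency = {species: round(count / total * 100, 1)
             for species, count in fragment_counts.items()}

for species, pct in frequency.items():
    print(f"{species}: {pct}%")
```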

Results and Discussion
Selection of individual plant species by sheep was not influenced (P≥0.23) by pasture or by the interaction between pasture and month of collection (i.e., period); therefore, diet selection was reported by period (Table 1). Grass species constituted 57.4% and 58.4% of sheep diets during August and September, respectively. Major grass species in the diet included little bluestem and big bluestem. More than 99% of grass plants selected by sheep were represented by the 8 reference standards. Collection month did not influence (P≥0.12) selection of big bluestem, little bluestem, indiangrass, blue grama, sideoats grama, sedges, or unidentified grasses. In contrast, selection of buffalograss by sheep increased (P<0.01) from August to September, whereas selection of switchgrass by sheep tended (P=0.06) to decrease from August to September.
Forb species accounted for 42.6% and 41.6% of sheep diets in August and September, respectively (Table 1). Major forbs in the diet included purple prairieclover and Baldwin's ironweed. More than 99% of forbs selected by sheep were represented by the 9 reference standards. In addition, no differences (P≥0.19) between collection months were observed for any of the 9 forb reference standards. In contrast, the number of unidentified forb fragments decreased (P=0.04) from August to September. Sheep selected 1.5% sericea lespedeza in August and 1.6% sericea lespedeza in September; in a previous publication, we reported that this level of consumption was associated with significant depression in seed production by sericea lespedeza.
In this experiment, grasses comprised 57 to 58% of sheep diets, whereas forbs comprised 42 to 43% of sheep diets. Selection of grasses by sheep was slightly greater than that reported in previous research. Major grasses in sheep diets were big and little bluestem, and major forbs in sheep diets were purple prairie clover and Baldwin's ironweed. Sericea lespedeza comprised approximately 1.5% of sheep diets. Consumption at that level is likely sufficient to control seed production by sericea lespedeza. Utilization of sheep in a grazing system with cattle may provide control of sericea lespedeza, whereas cattle grazing alone is not useful for sericea lespedeza control.

Introduction
Programming more efficient gains early in the feeding period is possible when utilizing limit-fed high-energy diets based on fermentable corn by-products such as wet corn gluten feed. The replacement of corn in the diet with corn by-products decreases the health risks associated with rapid starch digestion while maintaining total energy intakes sufficient for desired gains. Evidence from recent trials conducted at the Beef Stocker Unit supports this rationale; however, to our knowledge, research has not been conducted analyzing the effects on performance when one high-energy diet is limit-fed at increasing levels of controlled intake. If efficiency is maintained and average daily gain increases with increasing intake, then one diet could be used to achieve a variety of gains.

Results and Discussion
Performance results from the experiment are in Table 2. Final average daily gain increased linearly (P<0.01) with increasing intake. These results were expected and can be attributed to the overall increase in energy intake as more feed was offered. In addition, final dry matter intake increased linearly (P<0.01) by design of the experiment, as intake was the treatment administered. Most interestingly, final feed-to-gain ratio at the conclusion of the study was not affected by dietary treatment (P=0.98). We speculated that with intakes at 2.8% of body weight, the benefit in efficiency often realized when limit-feeding would be diminished, as a theoretical ad libitum intake would be similar to this value given the characteristics of the cattle. Because gain increased linearly while efficiency remained acceptable, a new approach to programming gain early in the growing period is possible. Based on these results, producers could potentially offer one diet from entry to the feedlot through the growing phase without the inconvenience and conflicts associated with step-ups and/or diet manipulation. However, it is important to note that because a diet was offered for ad libitum intake at the end of the study, compensatory gain from the more restricted cattle could have affected results. In addition, the overall high concentrate level of the 60 Mcal net energy for gain/100 lb dry matter diet could decrease the number of step-up diets needed for adaptation to a finishing diet. In conclusion, programming a variety of gains in the receiving and growing period is possible utilizing a single high-energy diet based on wet corn gluten feed without sacrificing performance.
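The programming logic described above reduces to simple arithmetic: if feed-to-gain holds roughly constant across intakes, as observed here, daily gain scales directly with the dry matter offered. A minimal sketch, using a hypothetical body weight and an assumed constant feed-to-gain value:

```python
# Hypothetical worked example of programming gain with one diet:
# if feed-to-gain is roughly constant (as observed in this trial),
# average daily gain scales with the dry matter offered.
body_weight = 600.0          # lb, hypothetical steer
feed_to_gain = 5.0           # lb DM per lb gain, assumed constant

for pct_bw in (2.2, 2.5, 2.8):          # % of body weight offered daily
    dmi = body_weight * pct_bw / 100    # lb dry matter intake per day
    adg = dmi / feed_to_gain            # lb average daily gain
    print(f"{pct_bw}% of BW -> {dmi:.1f} lb DM/day -> {adg:.2f} lb ADG")
```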

Introduction
Previous research has shown that increasing energy in diets fed to newly received growing cattle could increase morbidity. It is thought the increased incidence and severity of subacute and acute ruminal acidosis could be the cause, as dietary energy is most often increased by the removal of dietary roughage and the addition of cereal grains that provide large amounts of starch. Because starch is rapidly fermented in the rumen, the excessive production of organic acids can result, ushering in a variety of metabolic disorders all potentially contributing to increased morbidity. Recently, research conducted at the Kansas State University Beef Stocker Unit has demonstrated limit-feeding high-energy diets based primarily on corn by-products such as wet corn gluten feed could offer a more efficient approach to feeding newly received growing cattle without adversely affecting health. The overall diet digestibility and characteristics of digestion involving limit-fed diets based primarily on wet corn gluten feed have not been extensively studied.

Experimental Procedures
Six ruminally cannulated Jersey crossbred steers were used to determine diet digestibility and characteristics of digestion in limit-fed diets based primarily on wet corn gluten feed. Experimental diets were formulated to provide 45, 50, 55, and 60 Mcal net energy for gain/100 lb dry matter and were offered at 100%, 95%, 90%, and 85% of a predetermined ad libitum intake, respectively (Table 1). Two weeks before initiation of the trial, all animals were offered the 45/100 treatment to establish an ad libitum intake value. When the trial began, this value was used to dictate feed delivery of the treatments previously described. The trial consisted of 4 consecutive 15-day periods, each containing 10 days of diet adaptation, 4 days of fecal sampling, and 1 day of ruminal sampling. On days 4-14, chromium oxide was top-dressed and hand-mixed into the ration to serve as an indigestible marker for diet digestibility. Fecal samples were collected from the rectum of the steers on days 11-14, with 3 samples collected each day and the sampling time advancing by 2 hours each day such that every 2-hour increment post-feeding was represented. Fecal samples were composited for each steer within each period. The concentration of chromium oxide in feces was used to determine diet digestibility. On day 15, immediately before feeding and after the 0-hour sampling time, cobalt ethylene diamine tetraacetic acid was dosed through the cannula to serve as a marker for liquid dilution rate. Ruminal contents were collected and strained at 0, 2, 4, 6, 8, 12, 18, and 24 hours post-feeding. Ruminal fluid was analyzed for concentrations of volatile fatty acids, cobalt from cobalt ethylene diamine tetraacetic acid, and ammonia. In addition, an indwelling bolus (SmaXtec; Graz, Austria) inserted through the ruminal cannula at initiation of the trial was used to continuously monitor ruminal pH.
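The marker-based digestibility calculation implied above rests on the fact that chromium oxide passes through undigested, so its concentration in feces rises in proportion to the dry matter that disappears. A minimal sketch with hypothetical marker concentrations:

```python
def dm_digestibility(marker_in_feed_pct, marker_in_feces_pct):
    """Apparent total tract dry matter digestibility (%) from an
    indigestible marker: the marker concentrates in feces in
    proportion to the dry matter that disappears."""
    return 100.0 * (1.0 - marker_in_feed_pct / marker_in_feces_pct)

# e.g., 0.25% Cr2O3 in the diet concentrating to 1.00% in feces
print(round(dm_digestibility(0.25, 1.00), 1))  # -> 75.0
```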

Results and Discussion
Results from the trial are in Tables 2 and 3. Ruminal proportions of propionate increased with increasing energy level and decreasing intake (P<0.01) in the limit-fed rations, most likely as a result of the increased corn and, thus, increased ruminal starch fermentation. This is important because propionate is thought to be the most energy-yielding volatile fatty acid produced during ruminal fermentation. Ruminal ammonia concentrations increased linearly (P<0.01) with increasing energy and decreasing intake. Because cattle on the higher energy diets were being limit-fed, a large proportion, if not all, of the feed offered was consumed in one visit to the bunk, leading to increased levels of ammonia being released during microbial digestion. Total tract diet digestibility also increased linearly (P<0.01) with increasing energy and decreasing intake as a result of two very important drivers of diet digestibility. First, increased dietary energy was achieved in the limit-fed diets by the removal of roughage and the addition of more digestible ingredients like dry-rolled corn. Second, passage rate through the gastrointestinal tract is a function of intake. When intake is restricted, passage rate slows, and digestibility usually increases as a result of the increased residence time in the tract. This concept was verified as liquid dilution rate decreased linearly (P<0.01) with increasing energy and decreasing intake. When compared to the high roughage diet offered for ad libitum intake (45/100), the highest energy, most restricted diet (60/85) was 15% more digestible. Ruminal pH decreased linearly with increasing energy and decreasing intake (P<0.01) but was quick to recover and never decreased to levels likely associated with acute ruminal acidosis (Figure 1).
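Liquid dilution rate from the cobalt marker is conventionally estimated from the first-order decline of ruminal Co concentration: the slope of ln(concentration) against time gives the fractional dilution rate. A sketch under that assumption, with hypothetical concentrations:

```python
import math

# Hypothetical ruminal Co concentrations (mg/L) at hours post-dosing.
hours = [0, 2, 4, 6, 8, 12]
cobalt = [20.0, 16.4, 13.4, 11.0, 9.0, 6.0]

# Least-squares slope of ln(concentration) vs. time gives the
# fractional liquid dilution rate (per hour).
y = [math.log(c) for c in cobalt]
n = len(hours)
slope = (n * sum(t * v for t, v in zip(hours, y)) - sum(hours) * sum(y)) / \
        (n * sum(t * t for t in hours) - sum(hours) ** 2)

print(f"liquid dilution rate ~ {-slope * 100:.1f}%/hour")
```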

Implications
High-energy limit-fed diets based primarily on wet corn gluten feed are 15% more digestible and produce a higher concentration of energy-yielding volatile fatty acids compared to low-energy full-fed diets, without apparent disruptions in ruminal fermentation or health.

Figure 1. Ruminal pH measured continuously over 24 hours after feeding using an indwelling pH monitoring bolus (SmaXtec; Graz, Austria).

Introduction
More efficient gains could be achieved early in the growing period by limit-feeding high-energy diets based primarily on corn by-products. Research conducted at the Kansas State University Beef Stocker Unit leading up to this research trial has been carried out using only wet corn gluten feed as the corn by-product. While results have been positive, wet corn gluten feed is a commodity not available to all producers throughout Kansas, mostly because of the financial burden associated with transport.
Our goal was to analyze the effects of utilizing a high-energy limit-fed diet containing wet distiller's grains (a much more accessible commodity) compared to the traditional diet we had been feeding based on wet corn gluten feed (Sweet Bran; Cargill, Blair, NE). In addition, to our knowledge, research had not been conducted exploring the effects of dry-rolled or whole shelled corn processing in this particular feeding strategy.

Experimental Procedures
A total of 320 crossbred steers were purchased from a single source and shipped from 2 locations to the Kansas State University Beef Stocker Unit, Manhattan, KS, over a 2-day period from February 15 to 16, 2017. Two loads were shipped from Groesbeck, TX, (590 mi) and 2 loads from Hatch, NM, (886 mi) and used to determine the effects of by-product (wet corn gluten feed or wet distiller's grains with solubles) and extent of corn processing (whole shelled or dry-rolled corn) in a randomized complete block design with a 2 × 2 factorial arrangement of treatments. All 4 diet combinations were formulated to provide 60 Mcal net energy for gain/100 lb dry matter and contain 40% of their respective by-product (Table 1). On arrival, calves were weighed and assigned by body weight to pens, which were randomly assigned to dietary treatment. There were 10 steers per pen and 8 pens per treatment combination for a total of 32 pens. Animals were weighed individually on days -1, 0, 14, and 70. A pen scale was used to measure weights on days 7, 21, 35, 42, 49, 56, and 63. After weighing, diet delivery was adjusted by pen each week such that 2.0% of body weight on dry matter basis was offered for all treatments daily, targeting 2 lb/day average daily gain. All pens were fed daily at 7:00 a.m. Performance was calculated from day -1 to days 14, 28, 42, 56, and 70 and pen was the experimental unit. Approximately 12 hours before weighing on day 70, water was restricted overnight to more accurately estimate biological gain. Because day -1 weights were measured directly off the truck, and feeding occurred once daily, the goal was to have two shrunk weights to calculate performance.
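The weekly feed call described above is a two-step conversion: the dry matter to offer (2.0% of pen body weight) divided by the ration's dry matter fraction gives the as-fed amount to weigh out. A minimal sketch with hypothetical pen weights and dry matter content:

```python
def as_fed_delivery(pen_bw_lb, pct_bw, ration_dm_pct):
    """As-fed feed call (lb/day) for a pen: dry matter target
    (pct_bw % of pen body weight) converted using ration DM %."""
    dm_call = pen_bw_lb * pct_bw / 100        # lb DM to offer daily
    return dm_call / (ration_dm_pct / 100)    # lb as-fed to weigh out

# Hypothetical pen: 10 steers averaging 550 lb on a 60% dry matter ration
print(round(as_fed_delivery(10 * 550, 2.0, 60.0), 0))  # -> 183.0
```

This also illustrates why an error in the assumed dry matter percentage (as occurred with the wet corn gluten feed diets in this trial) directly shifts the dry matter actually delivered.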

Results and Discussion
No animals were treated for respiratory disease throughout this trial, indicating the cattle remained healthy. Performance data are in Table 2. There was a by-product × corn processing interaction for final dry matter intake (P<0.01). This result was unexpected, as all pens were meant to receive the same amount of feed on a dry matter basis. Analysis conducted after the trial was completed indicated the wet corn gluten feed diets were drier than initially thought; thus, more dry matter was being delivered than intended. Despite the difference in dry matter intake, neither by-product, corn processing, nor their interaction affected average daily gain or efficiency of gain (P≥0.38). Results from this trial indicate high-energy diets formulated with wet corn gluten feed or wet distiller's grains with solubles containing either dry-rolled or whole corn can be offered at 2% of body weight on a dry matter basis and achieve similar gains. This is important because producers have the choice of by-product and whether or not to process corn without their decision negatively affecting performance in this novel programmed-feeding strategy.

Introduction
Recent studies have evaluated the effect of the alpha amylase enzyme expression trait in Enogen Feed corn on finishing beef cattle performance. The results suggest variable but predominantly positive outcomes in feed efficiency when Enogen Feed corn is fed with either corn gluten feed (Sweet Bran) or modified wet distillers grains. Supporting metabolism work has determined that cattle fed Enogen Feed corn are able to utilize more starch from corn containing this trait, which provides them with more energy. It is not known if a similar response will be observed with growing cattle. Therefore, the objective of this study was to evaluate the health and performance of growing cattle when fed Enogen Feed corn.

Experimental Procedures
English crossbred steers (n = 426), averaging 538 lb, were transported from a single source in Lazbuddie, TX, to the Kansas State University Beef Stocker Unit (approximately 565 mi), Manhattan, KS, on May 15, 2017. Upon arrival, all animals were vaccinated for viral and clostridial diseases and treated for internal and external parasites. On day 21, all research animals were revaccinated for respiratory diseases. Thirty-two pens were used (8 for each treatment), composed of 12 animals each. Thirty-two steers on the lower end of the weight spectrum and 10 steers on the higher end of the weight spectrum were removed from the research population. The remaining 384 steers were stratified by weight and randomly assigned to pens, which were randomly allocated to 1 of 4 treatments. The four treatment diets were formulated to provide 51 Mcal net energy for gain/100 lb dry matter, and all were offered for ad libitum intake. The experiment was a 2 × 2 factorial design with two varieties of corn (Enogen Feed [Syngenta] vs. yellow) and two methods of corn processing (dry-rolled vs. whole-shelled). Pen was the experimental unit. The steers were fed their respective diets once daily at approximately 7:00 a.m. for 90 days. Individual animal weights were taken on days -1 (arrival), 0 (initial processing), 21 (revaccination), and 91 (final weights). Fecal samples for starch analysis were obtained from individual animals on days 56 and 57 and analyzed the same week. Pen weights were collected on days 7, 14, 35, 63, and 77. Feed delivery was adjusted based on daily refusals to ensure ad libitum intake without an excess of leftover feed. Bunk and individual ingredient samples were taken weekly.

Results and Discussion
During the entire 90-day trial, dry matter intake for calves fed Enogen Feed corn tended to be lower (P<0.09) than for calves fed yellow corn (Table 2). This difference was especially apparent through day 14, when calves fed yellow corn consumed more feed than their Enogen-fed counterparts (P<0.01). Average daily gain and off-test weights tended to be greater (P<0.10) for calves fed Enogen Feed corn over the entire 90-day trial. Feed efficiency was greater in calves fed Enogen Feed corn (P<0.01). As early as day 35, feed efficiency tended to be greater for Enogen-fed calves (P<0.07). For the remainder of the study (days 63 to 90), feed efficiency was significantly greater for calves fed Enogen Feed corn (P<0.01). Overall, the feed efficiency of calves receiving Enogen Feed corn was improved by 5.50%. By using a variety of corn that provides the alpha amylase expression trait, producers have the opportunity to capture the value of improved feed efficiency.

Implications
When fed in an ad libitum fashion to growing calves, Enogen Feed corn improves the efficiency of feed conversion by 5.50%. This response became apparent by day 35 and was a significant factor throughout the remainder of the study. There were no negative observations regarding cattle health or behavior with the feeding of Enogen Feed corn.

Introduction
Inflammation, stress, and overall suppression of the immune system have been documented in cattle experiencing complications from ruminal acidosis. Subacute and acute ruminal acidosis are most often caused by excessive fermentation of readily fermentable carbohydrates, the most common being starch. Metabolic and pathological issues associated with acidosis have limited the inclusion of starch in receiving diet formulation, thereby limiting dietary energy in most cases. The use of by-products in limit-fed diets has made it possible to increase energy beyond a typical receiving/growing ration, providing better performance. The specific effects such a feeding strategy may have on the overall health of the animal have not been extensively studied, to our knowledge.

Experimental Procedures
A total of 354 crossbred heifers (body weight = 478 ± 9 lb) were purchased at auction markets in Alabama and Tennessee, assembled at an order buyer's facility in Dickson, TN, then shipped 675 mi to the Kansas State University Beef Stocker Unit, Manhattan, KS, over a 10-day period from May 24 to June 3, 2016. The heifers were used in a randomized complete block design to analyze the effects of 4 energy levels and intakes of fibrous by-product-based diets on health and performance of stocker cattle in a 55-day receiving and growing study. Calves were blocked by load (4 loads), stratified by individual arrival weight within load, and assigned to pens containing 11 or 12 heifers. Pens within each block were randomly assigned to 1 of 4 treatments, resulting in 8 pens/treatment and a total of 32 pens. Experimental diets were formulated to provide 45, 50, 55, or 60 Mcal net energy for gain/100 lb dry matter and were offered at 100% (45/100), 95% (50/95), 90% (55/90), or 85% (60/85) of ad libitum intake (Table 1). All diets were formulated to contain 40% wet corn gluten feed (Sweet Bran; Cargill Animal Nutrition, Blair, NE) on a dry matter basis.
Thirty-two animals from each dietary treatment (4 from each pen) were randomly selected after arrival (day -1) and bled via the tail vein to serve as a subset for analysis of antibody production toward vaccines and the acute phase protein haptoglobin. Blood was collected again via tail-vein venipuncture on days 0, 14, and 27. Samples were shipped to the Kansas State University Veterinary Diagnostic Laboratory, Manhattan, KS, and analyzed for antibody titers for bovine viral diarrhea I and II and infectious bovine rhinotracheitis, as well as haptoglobin. Animals removed from the pen according to the illness protocol were also bled via the tail vein, and those blood samples were handled identically to the samples taken from the subset of cattle. In addition, a predetermined random order of animals from each pen was generated on day 0 that served as a means to select a healthy control animal from each pen; a blood sample was taken from the control following the same protocol for use in pairwise comparisons. Animals that became morbid were permanently removed from the list of healthy candidates and therefore could never serve as a "healthy" animal for comparison. Two randomly selected animals from each pen (16/dietary treatment) were also used to determine fecal cortisol metabolites as a means of quantifying stress. Fecal grab samples were obtained from the rectum of each of the selected animals on days 0 and 14 of processing. Samples were labeled by individual animal identification number and immediately frozen at -4°F (-20°C) for analysis. All fecal samples were shipped to the K-State Veterinary Diagnostic Laboratory to determine fecal cortisol metabolite concentrations.
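The pairwise-control procedure described above can be sketched as follows; the animal IDs are hypothetical, and the logic simply walks the predetermined day-0 random order, skipping any animal that has ever been morbid.

```python
import random

random.seed(0)  # the real order was fixed once, on day 0
pen = ["A1", "A2", "A3", "A4", "A5", "A6"]   # hypothetical animal IDs
control_order = random.sample(pen, k=len(pen))
ever_morbid = set()

def healthy_control(morbid_id):
    """Return the first never-morbid pen mate in the predetermined order."""
    ever_morbid.add(morbid_id)  # morbid animals can never serve as controls
    for candidate in control_order:
        if candidate not in ever_morbid:
            return candidate
    return None  # no healthy candidates remain

print(healthy_control("A3"))
```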

Results and Discussion
Serological results from the subset of cattle sampled from each treatment are in Table 2. There were no diet effects (P≥0.23) or diet × day interactions (P≥0.21) for titer production toward viruses, haptoglobin, or fecal cortisol metabolite excretion.
Haptoglobin increased for all dietary treatments between days 0 and 14 and decreased from days 14 to 27 (quadratic, P<0.01). Titer levels for the viruses increased linearly from days 0 to 27 (P<0.0001) and fecal cortisol metabolite was higher on arrival than on day 14, most likely due to stress of procurement and transport (P<0.01).
Neither dietary treatment nor the interaction of dietary treatment and health status affected any of the parameters measured. Haptoglobin was higher overall in morbid animals compared to healthy animals as a result of increased inflammation with morbidity (Figure 1; P<0.05). Titer levels for bovine viral diarrhea I and infectious bovine rhinotracheitis were higher in healthy animals compared to morbid pen mates (P<0.05). Bovine viral diarrhea II titers were not affected by health status (P>0.10).

Implications
Increased dietary energy from wet corn gluten feed in limit-fed receiving diets does not increase stress or inflammation as measured by haptoglobin, nor does it alter immune function as measured by titer levels to significant industry viruses. However, morbid animals demonstrated increased haptoglobin and decreased titers compared to healthy pen mates, independent of dietary treatment.

Bars with unlike superscripts differ (P<0.05).

Leucine Supplementation Did Not Improve Protein Deposition or Lysine Utilization in Growing Steers
Introduction
Amino acids are used for a wide variety of biological processes, one of which is building muscle. Though there is a requirement for each amino acid, the amounts that allow animals to most efficiently reach their genetic potential have yet to be identified. Protein can be the most expensive portion of the diet. By feeding the optimal amount of limiting amino acids, producers may be able to better feed animals to their genetic potential while saving money by not feeding excess protein. In addition, by determining the proper requirements of these amino acids, producers may be able to reduce nitrogen losses in urine and feces by increasing protein deposition by the animal. Lysine is generally one of the most limiting amino acids in cattle, making its delivery important. In addition, there is evidence that leucine is a key regulator of protein synthesis, and therefore it might increase performance by playing a regulatory role in muscle. Thus, providing leucine in amounts greater than the animal's requirement has the potential to increase growth and improve the efficiency of lysine utilization. This study was conducted to determine if leucine supplementation could improve protein deposition and lysine utilization in growing steers.

Experimental Procedures
Seven ruminally cannulated Holstein steers (initial body weight 380 ± 8.1 lb) were used in a 6 × 6 Latin square design and randomly assigned to one of 6 treatments in each of 6 periods. Each treatment period lasted one week, for a total of 6 weeks. Treatments consisted of 2 levels of lysine (0 or 0.212 oz/day) and 3 levels of leucine (0, 0.529, or 1.058 oz/day) infused abomasally; all 6 possible combinations of these two factors were tested. All steers received abomasal infusions of all essential amino acids, except lysine, to ensure that lysine was the only limiting amino acid. In addition, all steers received constant ruminal infusions of 12.35 oz/day of volatile fatty acids and abomasal infusions of 10.58 oz/day of glucose to provide additional energy to the steers.
The basal supply of leucine was designed to be adequate to meet the animals' needs for protein deposition, and supplemental infusions of leucine were used to investigate the regulatory effects of leucine on protein deposition. Leucine, in some animal models, is able to regulate protein synthesis by intracellular signaling pathways.
Steers were housed in metabolism crates in an environmentally controlled room to allow complete urine and fecal collection, which allowed measurement of nitrogen retention (an estimate of protein deposition). Feed (Table 1) was delivered twice daily at 6.2 lb/day, with the remaining nutrients being infused ruminally and abomasally. This was accomplished using a central pump with infusion lines routed through the ruminal cannula, one line terminating in the abomasum and the other in the rumen. Treatments were infused continuously throughout the study. The first two days of each period were used for treatment adaptation. Diet samples were collected on days 2 through 5, and fecal and urine outputs from days 3 through 6 of each period were collected. Samples were frozen until time of analysis, and each sample was analyzed for nitrogen content. Nitrogen retention was calculated as the difference between total nitrogen input and total nitrogen output. Data were analyzed using the MIXED procedure of SAS version 9.4 (SAS Inst. Inc., Cary, NC) as a 6 × 6 Latin square design. Treatment, treatment interactions, and period were considered fixed effects; animal was considered a random effect. Significance was declared at P<0.05.
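The nitrogen-retention calculation described above is a simple balance: total nitrogen input (feed plus infusates) minus total nitrogen output (feces plus urine). A minimal sketch with hypothetical daily amounts:

```python
def n_retention(feed_n, infused_n, fecal_n, urinary_n):
    """Nitrogen retention (oz N/day) = total N input - total N output."""
    return (feed_n + infused_n) - (fecal_n + urinary_n)

# Hypothetical daily nitrogen flows, oz N/day
print(round(n_retention(feed_n=1.20, infused_n=0.80,
                        fecal_n=0.55, urinary_n=0.95), 2))  # -> 0.5
```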

Results and Discussion
Nitrogen retention increased when supplemental lysine was infused postruminally (P<0.0001, Figure 1). This response was expected because the model was designed to be limiting in lysine. No differences were observed in response to leucine infusion (P=0.46), and no interactions between leucine and lysine treatments were observed (P=0.72). These data suggest that, at the levels included in this study, leucine had no effect on lysine utilization in terms of nitrogen retention. However, in conditions in which lysine is not limiting, it is possible that leucine could yield a benefit.
Plasma concentrations of lysine increased (P<0.0001, Figure 2) with lysine supplementation, and leucine increased linearly (P<0.0001, Figure 3) with postruminal supplementation of leucine. This suggests that the model was successful in delivering these amino acids to the steers.

Implications
Supplementing growing cattle with postruminal leucine had no effect on nitrogen retention when lysine was limiting. However, supplemental lysine at 0.212 oz/day increased nitrogen retention.

Figure 1. Nitrogen retention in response to postruminal lysine and leucine supplementation in growing steers fed a lysine-limiting diet. Nitrogen retention increased (P<0.0001) when lysine was increased from 0 to 0.212 oz/day, but no differences (P=0.46) were observed in response to supplemental leucine.

Introduction
Tenderness, juiciness, and flavor contribute to a consumer's overall experience of eating beef. These sensory traits can be detrimentally affected by increased cooking temperature (Lorenzen et al., 2003). As degree of doneness, or final internal temperature, increases, consumer sensory scores for palatability traits decrease, leading to a decrease in overall eating satisfaction. However, an increase in the amount of marbling can lead to an increase in consumer sensory scores for tenderness, juiciness, and flavor, as well as overall eating satisfaction (O'Quinn et al., 2012; Lucherk et al., 2016). One theory proposes that increased marbling can counteract the negative effect of increased cooking temperatures. This is known as the insurance theory, as marbling acts as "insurance" to maintain an acceptable eating experience as degree of doneness increases (Savell and Cross, 1988). To our knowledge, limited research has evaluated this theory, but potential verification could benefit the industry by better identifying products that will meet consumer eating expectations based on their preferred degree of doneness. Therefore, the objective of this study was to determine whether increased marbling reduces the negative impact that increased degree of doneness has on consumer palatability scores.

Experimental Procedures
Paired beef strip loins (Institutional Meat Purchase Specifications #180) were collected from four U.S. Department of Agriculture quality grades [Prime, Top Choice (Modest and Moderate marbling), Low Choice, and Select; n = 12 pairs/quality grade]. An additional 12 pairs of USDA Select strip loins were collected for moisture enhancement. Subprimals were aged for 21 days in vacuum packages at 39.2°F. At the end of the aging period, strip loins designated for enhancement were enhanced at an 8% pump level with a salt and alkaline phosphate solution using a multi-needle injector (Model N50; Schröder Maschinenbau GmbH, Werther, Germany). Strip loins were fabricated into 1-in steaks, with three consecutive steaks forming a group, for three groups per strip loin and six groups per strip loin pair. Within each strip loin pair, groups were randomly assigned one of six degrees of doneness: very-rare (130°F), rare (140°F), medium-rare (145°F), medium (160°F), well-done (170°F), or very well-done (180°F), so that each carcass had representation for each degree of doneness. Steaks were cooked to their designated degree of doneness on a clamshell grill (Cuisinart Griddler Deluxe, Model GR-150, East Windsor, NJ) with temperatures monitored using a probe thermometer (Super-Fast Thermapen, ThermoWorks, American Fork, UT). Consumers (n = 360) were served eight samples representing differences in quality grade and degree of doneness in a random order. Consumers in individual sensory booths under red incandescent lighting evaluated each sample for juiciness, tenderness, flavor, and overall liking on continuous 100-point line scales. Additionally, consumers rated each trait as either acceptable or unacceptable.
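The within-pair randomization described above can be sketched as follows: each strip loin pair's six steak groups are dealt the six doneness endpoints in a shuffled order, so every carcass is represented at every endpoint. The group labels are hypothetical; the temperatures are the endpoints from the protocol.

```python
import random

# Degree-of-doneness endpoints (°F) from the protocol above.
doneness_f = {"very-rare": 130, "rare": 140, "medium-rare": 145,
              "medium": 160, "well-done": 170, "very well-done": 180}

def assign_pair(rng):
    """Randomly assign the six endpoints to a pair's six steak groups."""
    order = list(doneness_f)
    rng.shuffle(order)
    return {f"group {i + 1}": d for i, d in enumerate(order)}

print(assign_pair(random.Random(42)))
```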
Results and Discussion
Least squares means for consumer sensory scores are shown in Table 1. There was no interaction (P>0.05) between quality treatment and degree of doneness for any of the sensory traits evaluated. For the main effect of quality treatment, Select Enhanced had the highest consumer ratings (P<0.05) for juiciness, tenderness, flavor, and overall liking, followed by Prime, which was higher (P<0.05) than all lower-grading samples. Top Choice and Low Choice had similar consumer ratings (P>0.05) for all sensory traits evaluated. Select steaks had the lowest (P<0.05) ratings for juiciness, flavor, and overall liking and were lower (P<0.05) for tenderness than all grades other than Low Choice. When evaluating the impact of degree of doneness on consumer ratings, very well-done steaks had the lowest scores (P<0.05) for all traits evaluated except overall liking, for which they were similar (P>0.05) to well-done. For juiciness, very-rare steaks were rated higher (P<0.05) than all other treatments except rare, which was similar (P>0.05). Very-rare, rare, and medium-rare steaks had similar (P>0.05) consumer ratings for tenderness, flavor, and overall liking. Figure 1 shows the interaction (P<0.05) between quality grade and degree of doneness for the percentage of steaks rated acceptable for juiciness. There were no differences (P>0.05) among quality treatments within very-rare for juiciness acceptability. However, Select Enhanced had a higher (P<0.05) percentage of steaks rated acceptable than all other treatments at medium and above, being similar (P>0.05) only to Prime at very well-done. Select steaks had the lowest (P<0.05) percentage of steaks rated acceptable for juiciness at rare and medium, and were similar (P>0.05) only to Low Choice and Top Choice at well-done and very well-done. Prime had a similar (P>0.05) percentage of steaks rated acceptable for juiciness as Top Choice and Low Choice at all degrees of doneness, except when cooked to very well-done.
Prime had a higher (P<0.05) percentage than all non-enhanced treatments, including Top Choice and Low Choice, when cooked to very well-done. Top Choice had a similar (P>0.05) percentage of steaks rated acceptable for juiciness as Low Choice across all degrees of doneness.
For consumer palatability scores, the negative impact of increasing final internal cooking temperature on juiciness, tenderness, flavor, and overall liking was the same across all quality treatments, as indicated by the lack of a significant quality grade × degree of doneness interaction. Palatability scores generally increased as marbling increased, with the exception of Select Enhanced samples, which consumers scored highest for all sensory traits. These results imply that the insurance theory is not valid for individual palatability scores, as an increase in marbling provided no added benefit in reducing the negative impact of increasing degree of doneness on each trait. However, when evaluated on an acceptability basis, an increase in marbling score does appear to shift the point at which a sample becomes unacceptable for juiciness. For each quality treatment, there appears to be a degree of doneness at which there is a sharp reduction in the percentage of steaks rated acceptable for juiciness. For Select and Low Choice samples, the decrease appears to occur immediately after medium-rare; that drop is not observed in Top Choice until after medium. Enhanced samples maintained acceptable juiciness until cooked to very well-done, most likely due to their added moisture. Prime steaks, however, showed only a steady, slight decline in the percentage of samples rated acceptable across all degrees of doneness, without the dramatic drop-off in acceptability observed in the other quality grades. Thus, the relationship between marbling and degree of doneness for juiciness acceptability does appear to be consistent with the "insurance theory."

Implications
These results indicate that marbling could play a role in compensating for the negative effects of advanced degrees of doneness on juiciness acceptability, providing insight into the quality grade needed for consumers to be satisfied with juiciness based on their preferred degree of doneness.

Introduction
In the beef industry, tenderness, juiciness, and flavor are associated with beef palatability and overall eating experience. The odds of overall palatability failing when tenderness is acceptable are 10%, whereas the odds of overall palatability failing when tenderness is unacceptable are 69%. The industry continually strives to improve beef quality and palatability; thus, it is important to study tenderness in order to improve consumer satisfaction. Myofibrillar fragmentation index is a measure of tenderness that has previously been associated with sensory tenderness ratings (Olson et al., 1977). However, to our knowledge, little research has evaluated myofibrillar fragmentation differences within and between muscles. Therefore, the objective of this study was to determine the correlation between myofibrillar fragmentation index, Warner-Bratzler shear force, and sensory traits of strip loin (longissimus lumborum) and eye of round (semitendinosus) steaks.

Experimental Procedures
Forty beef strip loins (Institutional Meat Purchase Specifications #180) and 40 eye of rounds (Institutional Meat Purchase Specifications #171C) were collected from a Midwest beef processor and transported to the Kansas State University Meats Laboratory, Manhattan, KS. Subprimals were divided by anatomical location (anterior, middle, and posterior for the longissimus lumborum; proximal and distal for the semitendinosus), cut into three 1-in-thick steaks, and aged 14 days. Within location, steaks were randomly assigned to Warner-Bratzler shear force, trained sensory panel evaluation, or myofibrillar fragmentation index analysis. Steaks used for Warner-Bratzler shear force and trained sensory panel evaluation were cooked to an internal temperature of 160°F on electric clamshell grills (Cuisinart Griddler Deluxe, Cuisinart, East Windsor, NJ). Steaks used for Warner-Bratzler shear force were chilled overnight at 40°F, and six 0.4-in cores were removed parallel to the muscle fiber orientation and sheared once through the center, perpendicular to the muscle fiber orientation, using an Instron testing machine (Model 5569, Instron Corp., Norwood, MA) fitted with a Warner-Bratzler shear head. Sensory panel steaks were cut into 0.5 × 0.5 × 1-in cubes and immediately served to sensory panelists trained following American Meat Science Association guidelines for sensory evaluation (2016). Myofibrillar fragmentation index was determined using procedures described by Culler et al. (1978). Data were analyzed as a completely randomized design with muscle as the fixed effect. Subprimal location data were analyzed independently for each muscle as a completely randomized design with anatomical location as the fixed effect.
Results and Discussion
When comparing muscles, there were differences (P<0.05) between the longissimus lumborum and semitendinosus for all variables measured (Table 1). Within the longissimus lumborum (Table 2), panelists rated anterior steaks higher (P<0.05) for myofibrillar and overall tenderness than middle and posterior steaks, which were not different (P>0.05) from each other. Panelists detected less (P<0.05) connective tissue in anterior steaks compared with middle and posterior steaks, which were not different (P>0.05) from each other. In the semitendinosus, proximal steaks had higher (P<0.05) Warner-Bratzler shear force values and sensory connective tissue amounts than distal steaks (Table 3). Proximal steaks had lower (P<0.05) myofibrillar and overall tenderness than distal steaks. Within each subprimal, anatomical location had no effect (P>0.05) on myofibrillar fragmentation index value. Myofibrillar fragmentation index was correlated (P<0.05) with myofibrillar tenderness (r = -0.18), connective tissue (r = 0.11), and overall tenderness (r = -0.15); however, myofibrillar fragmentation index was not correlated (P = 0.056) with Warner-Bratzler shear force.

Implications
In the longissimus lumborum and the semitendinosus, myofibrillar fragmentation index was not dependent upon anatomical location. Moreover, the correlations between myofibrillar fragmentation index, Warner-Bratzler shear force, and sensory measures of tenderness were weak, indicating myofibrillar fragmentation index was not a reliable indicator of beef tenderness for these muscles.

Introduction
Currently, 119 different branded beef programs are governed by the U.S. Department of Agriculture (USDA) Agricultural Marketing Service. While marbling texture is not officially considered when USDA quality grades (USDA, 1997) are determined, 75% of branded beef programs require carcasses to have fine or medium textured marbling to meet set standards. While a multitude of factors can contribute to the beef eating experience, marbling is often thought to play a key role. Previous research reported that fine-marbled beef was more tender than coarser-marbled beef, and proposed perimysial connective tissue as the likely cause for the observed difference. Aside from the extent of postmortem proteolysis and muscle fiber structure, collagen is a major influencer of tenderness (Koohmaraie et al., 2002). Differences in tenderness between muscles can occur, in part, due to background effects related to the amount of connective tissue and/or the solubility of collagen. To date, this theory has not been adequately evaluated. Therefore, the objective of this study was to determine the effects of marbling texture on collagen traits and adipocyte cross-sectional area.

Experimental Procedures
Beef strip loins (n = 117) from three USDA quality grades [Top Choice (Modest and Moderate marbling), Low Choice, and Select] and three marbling textures (fine, medium, and coarse) were selected using visual appraisal. To fit the criteria for one of the three marbling textures, 75% of the marbling in the ribeye had to meet the USDA standard for fine, medium, or coarse textured marbling. After selection, strip loins were transported to the Kansas State University Meat Laboratory, Manhattan, KS, and fabricated into 1-in steaks. Four marbling flecks and the surrounding meat (0.6 in³) were then taken from the medial, central, and lateral portions for adipocyte histochemical analysis. The remaining portion of the steak was vacuum packaged, aged 21 days, homogenized using a Waring blender (Waring Products Division; Hartford, CT), and stored at -112°F for collagen analysis. Each marbling fleck was cryosectioned and subjected to Masson's trichrome staining for perimysial collagen and adipocyte staining. Photomicrographs were taken; the cross-sectional area of a minimum of 200 adipocytes was measured, and perimysial connective tissue thickness was measured every 10 µm (Figure 1). Insoluble, soluble, and total collagen content was determined using previously published methods. Perimysial collagen was extracted from the meat, freeze-dried, and analyzed using a differential scanning calorimeter (Shimadzu Scientific Instruments, Kyoto, Japan) to determine peak melting temperature. Data were analyzed as a completely randomized design with a 3 × 3 factorial treatment arrangement.

Results and Discussion
There were no marbling texture × quality grade interactions (P>0.05) for any of the traits studied. All three marbling textures (fine, medium, and coarse) contained similar (P>0.05) amounts of soluble, insoluble, and total collagen (Table 1), and all three quality grades likewise had similar (P>0.05) amounts of soluble, insoluble, and total collagen.
Similarly, neither marbling texture nor quality grade affected (P>0.05) perimysial thickness or the peak thermal transition (melting) temperature of perimysial collagen.
Marbling texture and quality grade each impacted (P<0.05) adipocyte cross-sectional area (Figure 2). Coarse-marbled steaks contained larger (P<0.05) adipocytes than fine-marbled steaks, while adipocytes of medium-marbled steaks were similar (P>0.05) in size to those of both fine- and coarse-marbled steaks. While adipocytes of Top Choice and Low Choice steaks were similar (P>0.05) in size, they were larger (P<0.05) than adipocytes of Select steaks.

Implications
These results indicate that marbling texture has no effect on collagen traits, and any potential tenderness differences among beef varying in marbling texture are not related to these traits. However, both quality grade and marbling texture impacted adipocyte cross-sectional area.
Steaks were also evaluated for muscle fiber cross-sectional area and myosin heavy chain type distribution; a minimum of three photomicrographs were taken and a minimum of 300 fibers were analyzed per section.

Results and Discussion
Marbling texture did not impact (P>0.05) fiber cross-sectional area for any of the three myosin heavy chain (Type I, Type IIA, and Type IIX) isoforms (Figure 1). However, marbling texture did affect (P<0.05) the distribution of the myosin heavy chain isoforms (Figure 1). Steaks with medium marbling texture displayed a greater (P<0.05) percentage of Type IIA fibers than fine- and coarse-marbled steaks, which had similar (P>0.05) percentages of Type IIA fibers. In contrast, fine- and coarse-marbled steaks displayed a greater (P<0.05) percentage of Type IIX fibers than medium-marbled steaks, and were similar (P>0.05) to each other for the percentage of Type IIX fibers. Quality grade (Top Choice, Low Choice, and Select) did not impact (P>0.05) fiber cross-sectional area (Figure 2). Additionally, quality grade had no impact (P>0.05) on the distribution of Type I, Type IIA, or Type IIX fibers (Figure 2).

Implications
These results indicate that marbling texture does not impact muscle fiber cross-sectional area. Any potential differences in tenderness with varying marbling texture are not due to muscle fiber cross-sectional area or fiber type.
Figure 2. Least squares means of myosin heavy chain (MHC) a) cross-sectional area and b) fiber type distribution of beef strip loin steaks of varying quality grade treatments.

1 Top Choice = marbling score of Modest 00 to Moderate 100.

Biological Variability and Chances of Error
Variability among individual animals in an experiment leads to problems in interpreting the results. Animals on treatment X may have higher average daily gains than those on treatment Y, but variability within treatments may indicate that differences in production between X and Y were not the result of treatment alone. Statistical analysis allows us to calculate the probability that such differences are from treatment rather than chance.
In some of the articles herein, you will see the notation P<0.05. That means the probability that the differences resulted from chance is less than 5%. If two averages are said to be significantly different, the probability is less than 5% that the difference is due to chance, or greater than 95% that the difference resulted from the treatments applied.
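As a hypothetical illustration (the gain values and the permutation approach below are illustrative assumptions, not the statistical method used in these articles), the probability that a difference between two treatment averages arose by chance can be estimated by repeatedly reshuffling the animals between groups and seeing how often a difference at least as large appears:

```python
import random
import statistics

def permutation_p_value(group_x, group_y, n_permutations=10000, seed=42):
    """Estimate the probability that the observed difference between
    two group means could arise by chance alone (two-sided)."""
    rng = random.Random(seed)
    observed = abs(statistics.mean(group_x) - statistics.mean(group_y))
    pooled = list(group_x) + list(group_y)
    n_x = len(group_x)
    at_least_as_extreme = 0
    for _ in range(n_permutations):
        rng.shuffle(pooled)  # randomly reassign animals to "treatments"
        diff = abs(statistics.mean(pooled[:n_x]) - statistics.mean(pooled[n_x:]))
        if diff >= observed:
            at_least_as_extreme += 1
    return at_least_as_extreme / n_permutations

# Hypothetical average daily gains (lb/day) for treatments X and Y
treatment_x = [3.1, 3.4, 3.2, 3.6, 3.3, 3.5]
treatment_y = [2.8, 3.0, 2.9, 3.1, 2.7, 3.0]
p = permutation_p_value(treatment_x, treatment_y)
# A p below 0.05 would be reported as P<0.05: the observed difference
# is unlikely to be due to chance alone.
```

If the two groups were identical, nearly every reshuffled difference would match or exceed the observed one and p would approach 1, correctly signaling no treatment effect.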
Some papers report correlations or measures of the relationship between traits. The relationship may be positive (both traits tend to get larger or smaller together) or negative (as one trait gets larger, the other gets smaller). A perfect correlation is one (+1 or -1). If there is no relationship, the correlation is zero.
In other papers, you may see an average given as 2.5 ± 0.1. The 2.5 is the average; the 0.1 is the standard error. The standard error is calculated such that we are 68% certain that the real average (with an unlimited number of animals) would fall within one standard error of the observed average, in this case between 2.4 and 2.6.
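The standard error in that example can be computed from raw data as the sample standard deviation divided by the square root of the number of animals; the weights below are hypothetical:

```python
import statistics

weights = [2.4, 2.6, 2.5, 2.3, 2.7, 2.5]  # hypothetical measurements
n = len(weights)
avg = statistics.mean(weights)              # the reported average
sem = statistics.stdev(weights) / n ** 0.5  # standard error of the mean
low, high = avg - sem, avg + sem            # ~68% interval for the true average
```

Adding more animals shrinks the standard error (the denominator grows), which is why larger experiments detect real treatment differences more reliably.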
Using many animals per treatment, replicating treatments several times, and using uniform animals increase the probability of finding real differences when they exist. Statistical analysis allows more valid interpretation of the results, regardless of the number of animals. In all the research reported herein, statistical analyses are included to increase the confidence you can place in the results.