THE OFFICIAL PUBLICATION OF
THE UNIVERSITY INTERSCHOLASTIC LEAGUE

Sight Reading Contest Committed to Consistency

By Richard Floyd, Music Director | Tuesday, May 03, 2011 11:03 AM

The following article appeared in The Leaguer in April of 2004. Based on recent occurrences relating to band sight-reading music, it would seem timely to run it once again. Please read on.

There are two things that I know will happen every spring. The first is the need to file my income tax return by April 15. That one is between Uncle Sam and me. The other is the inevitable, passionate discussion of sight-reading music that surfaces every year about this time. That one concerns all of us and I assure you that it is not an issue that I take lightly.

When I came to the UIL in 1984, my predecessor gave me a great deal of sage advice. As years have passed and times have changed, some of his counsel has lost relevance, but his assessment of the sight-reading contest continues to be remarkably accurate. Dr. Patrick assured me that regardless of the source of the music, the method used to select it or the system used to judge it, there would always be a portion of our directors who would be displeased. He was so right.

Does this mean that such complaints are merely idle chatter and have no merit? Or that it is a “band director thing” and can be taken lightly? Absolutely not! There is always a need for evaluation and refinement. If the majority of directors and administrators view the current process and music sources as deficient, then we are obligated to refine the process or take the initiative to seek better options.

But as we embark on this discussion, I trust that the vast majority of music educators (if not all) will agree that sight-reading is important. It is one of the primary tools for measuring musical literacy. In one form or another, it is an integral component of the evaluation process in virtually every state that claims to have a comprehensive, performance-based music education program. If we are in agreement regarding the value of sight-reading, then we must persevere in our quest for the ultimate tool to measure this skill.

A second consideration must also factor into the debate. Does the current system have merit?  Apparently, many think so. In fact, as of this date, the UIL sight-reading music is being used in 23 states, including Virginia, Illinois, Georgia and Oklahoma. There is clear evidence that this project is providing a valued service to contest programs not only in Texas but also throughout the nation. Even so, we cannot ignore the concerns of those educators who feel the system lacks consistency.

So how does the system work at present? The structure of the timeline is as follows:

(1) In September, composers are commissioned to write music for each level. The criteria and other specifics are spelled out in detail. These composers come from a variety of sources. Many approach the UIL seeking the opportunity to write. They are encouraged to submit scores for screening prior to being extended a contract. Others are established, published composers who embrace the vision we hold for this project.

(2) The following March, each composer submits a first draft of the music. The music is perused for ranges, performance length and adherence to the criteria regarding time signature, key signature and rhythms. The state office conducts this initial review and the music is returned to the composers for revisions as required.

(3) A second, revised score containing recommended changes is submitted in May. This score is then sent to the five committee members who represent all performance levels. The committee members have 30 days to study these scores and submit their approval or recommendations for additional modifications.

(4) The composers then make a second round of revisions based on the recommendations of the committee members.

(5) Finally, the committee meets during the Texas Bandmasters Association Convention to collectively give the scores one final review. Consideration is given at this time to such issues as page turns, score format, clarity of parts and other production issues. Additional revisions may be submitted if necessary.

(6) A professional FINALE editor then takes all scores and converts them into a uniform format and appearance.

(7) Once the editing is complete, a score and parts are produced and read by a university band in a final effort to identify print errors or other flaws. There are generally “five sets of ears” present during the reading session.

(8) After final print corrections, the works are sent to RBC Music for publication and distribution the following spring.

So, who picks this committee and who is on it? It comprises five career music educators who have years of experience in the sight-reading room, both as directors and judges. Each member is appointed in joint consultation with the leadership of UIL, TMEA and ATSSB and serves a two-year term. Be assured they take their task seriously and spend many hours, collectively and individually, reviewing music for the following year’s contest.

With all of those fail-safes in place, why do concerns still exist? With this many eyes and ears involved, some argue that consistency should not be an issue. Yet, over the years, we hear comments similar to the following: “This year’s Level 4 music was harder than last year’s Level 4 music.” “The Level 1 piece is harder than the Level 2 music.” “There are ‘tricks’ in Level 1 that do not appear in Level 2.” “Level 2 has a time signature change and Level 3 does not.” The list goes on and on.

So what can we do? In the case of the Level 1 and Level 2 pieces, a few years ago we adopted the policy of having both pieces written by the same composer. The goal was for the same person to structure these pieces in such a way that the demands were consistent with expectations at each level. We have also tried to secure composers who teach or have taught at this level. In other words, the composers who have written these pieces over the last three years have actually “walked in the shoes” of the directors whose bands will read them.

Yet there is still variation from year to year. The most telling analogy might be a comparison with the development and release of a new automobile. There are years of design and testing. Prototypes are made and refined. Specialists evaluate and revise every component of the design. Road test after road test is conducted to “work out the bugs.” Then the car appears on the showroom floor and the public begins to drive the vehicle. Within months, little problems, and sometimes major flaws, begin to surface. These are deficiencies that were never apparent during the entire design process.

Realistically, we experience the same kind of evolution in our sight-reading music. The music passes every level of scrutiny in the design and revision process. Experienced teachers review the scores and conclude that the music fits the criteria and is appropriate for the level intended. Then, when the music is read time after time in the contest setting at multiple sites across the state, questionable issues sometimes emerge. Yet at that point we are committed to a course that will take us through the contest season for that spring.

At the same time, we have to acknowledge that there are many (both directors and judges) who feel the music is adequate for the task. These comments go something like, “No, it is not a great piece of music, but it fits the criteria, and any band that has been taught the fundamentals of sight-reading should be able to achieve success.” I will take the first hit and say I tend to agree. I have heard non-varsity and even sub-non-varsity bands read these pieces with a high degree of competence. Do I feel that they are artistic compositions? No. Do they fit the criteria that have been established for each classification? Yes. Should we modify the criteria to more accurately reflect the expectations for these contests? Yes, if there is consensus that such changes are worthwhile.

Some say we should simply go back to the old system of selecting music off the shelf. Under that structure, a committee would meet at a sheet music store in San Antonio and, in eight hours or less, select the music to be read the following spring. Anything close to the then-limited criteria was considered. Inevitably, there were range issues, length issues, solo issues and instrumentation issues. Anyone remember “Command Z” by Ann McGinty? Unfortunately, there were also numerous cases of directors securing copies of the selected pieces from out of state and prompting their students on the music. Surely we do not want to return to that system and all of the baggage that accompanied it.

It is my vision that we move forward with the structure we have. Let’s redouble our commitment to the selection of consistent, worthy music. Let’s evaluate our criteria and make sure they are consistent with our expectations, and let’s continue to seek competent composers who can supply music that fits those expectations. If we are vigilant and work together, we can continue to move toward our goal of providing consistent, educationally sound sight-reading material at every level.