Sunday, December 15, 2013

Counting and Measuring in the Classroom

At the end of every month, I open up my old-school attendance notebook and total the data points in columns and rows. This is a habit I started seven years ago as an adjunct in the Basic Ed classroom, when I first wanted to know how I was doing and how I could improve.

Sure, the creative, free-flowing part of my brain saw the results of daily lessons and thought that everything was brilliant, just brilliant, but my quantitative, business-minded side wasn't so easily persuaded by colorful thoughts. It asked persistently, "Do you have the data to back that up?" I know from a variety of business experiences that almost everything can be measured or counted; it's just a matter of asking the right questions and tallying the data.

If you can't count or measure something, it probably doesn't exist.

At first, though, I wasn't sure what data I needed. So I started to collect and organize what I had at hand: number of students per session, programs represented, hours attended, tests taken, results achieved, and so forth. After one semester of collecting, I had a baseline of data points. One semester of data still didn't answer the question of how I was doing, since I had nothing to compare it against. So I kept on.
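For readers who like to see the mechanics, here is a minimal sketch of that kind of baseline tally in Python. The file name and the column names (student_id, program, hours) are assumptions for illustration, not the actual layout of my attendance notebook.

import csv
from collections import Counter

def semester_baseline(path):
    """Tally one semester's sign-in log: unique students, visits by program, total hours.
    Assumes a CSV with columns: date, student_id, program, hours."""
    students = set()
    visits = Counter()
    hours = 0.0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            students.add(row["student_id"])
            visits[row["program"]] += 1
            hours += float(row["hours"])
    return {"unique_students": len(students),
            "visits_by_program": dict(visits),
            "total_hours": hours}

# Hypothetical file name for one semester's sign-in sheet.
print(semester_baseline("fall_signins.csv"))

However you keep the raw tallies, the point is the same: one pass over the sign-in records gives you a handful of totals you can write down and compare later.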

After another semester, I started to notice trends by comparing the first semester to the next. I felt better, but two semesters of data still didn't seem terrifically valid, since I suspected that the fall semester differs from the spring and summer semesters (which it does, by quite a bit). What I needed was a full academic year, including the summer term.
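The semester-to-semester comparison itself is nothing fancier than a percent-change calculation over those same totals. Continuing the sketch above (and reusing its semester_baseline() helper, with the file names again hypothetical):

def percent_change(old, new):
    # Guard against a zero baseline.
    return (new - old) / old * 100 if old else float("nan")

fall = semester_baseline("fall_signins.csv")      # helper from the sketch above
spring = semester_baseline("spring_signins.csv")

for key in ("unique_students", "total_hours"):
    print(f"{key}: {fall[key]} -> {spring[key]} "
          f"({percent_change(fall[key], spring[key]):+.1f}%)")

A fall-versus-spring delta computed this way is exactly the kind of number that looks meaningful until you remember the two terms draw different populations, which is why a full year of baseline matters.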

In the meantime, I continued to collect data, to ask questions of it, and to pass the reports on to the center managers and my associate dean. Some of the information has been important for Basic Ed funding (student numbers, attendance, and entry and exit tests, all part of GED scorecards), some for center BE scheduling, and some, while interesting, had no immediate practical importance. For example, at one time I tracked how many students signed into the center lab just for testing and how many were there for GED or pre-program instruction; since testing has moved to other desks, that data is now irrelevant.

And, my old notebooks and monthly reports have provided a hard record of the evolution of the Basic Ed program at my two sites. Initial data quantified the old system of open academic skills labs, while the 2012-2013 Academic Year data showed how the school-wide Pathways initiative changed the flow of students at my two sites. Now, this fall (AY 2013-2014), the data shows upward trends, possibly due to changes we made in the delivery of both the GED/HSED content and the college prep classes. I say "possibly due" since the increase might have more to do with the GED 2002 series Closeout Campaign than with any structural changes we have made. Time will tell. Data collection is an exercise in patience as well as persistence. Ask me about trends in May or next fall.

In the meantime, the monthly exercise of data collection, comparison, and trend-watching satisfies my personal business-numbers curiosity and helps me recommend the best BE strategy at the centers. And, the data comes in handy when my center managers or associate dean have specific questions for me. When you have the data, you have the basis for intelligent, evidence-based answers. Without data, well, your guess is as good as mine.
