Sunday, April 05, 2015

Teach This, Not That

I think I've made it pretty clear in previous postings that I'm not a fan of standardization, but I realize that most teachers don't have a lot of choice and are required to teach to certain standards. Given that, teachers still often make choices about which standards they cover (since there's never enough time to cover them all) and how in-depth they go on each standard. Since my daughter is currently taking Algebra, I'm helping a home-bound student with Algebra, and I occasionally teach Algebra myself, I thought I would pick an example from Algebra for my first (and perhaps last) "teach this, not that" post.

I was recently helping that home-bound student with the polynomial unit in Algebra 1. She covered some marginally interesting topics, but - since I get to cherry-pick for this post - she also did an assignment that I'll excerpt below.

Here's a screenshot of some of the problems she had to do.

And here's a screenshot of the answers.

Later in the semester she'll get to explore exponential functions a bit, so we'll see what types of activities she gets to do for those, but one typical way they could explore exponential growth would be a compound interest type of problem.

Now, every teacher is different, but based on my experience, if an Algebra teacher has to choose one of these two things to cover in an Algebra 1 course, they often pick the first one. Why? Because polynomials seem to "fit" better in the Algebra 1 curriculum and exponential growth does not, and because compound interest problems are often presented in a way that amounts to just plugging numbers into a formula and computing an answer.

From my perspective, however, it should be exactly the opposite. There may be some reason why some people might want to know that a certain polynomial is a quartic trinomial, but I have to think that for most of our students that's not a particularly good use of their time. Compound growth, however, is something that could be life-changing for them and their families (credit cards, car loans, mortgages, savings accounts, investments - and that's just financial applications), yet even when we do teach it, we often teach it as simply "plug-n-chug."

Here are two problems that I think would be interesting for every high school student to explore (and probably most of the high school staff, for that matter).

(Please note that while the math in these examples works no matter what, the feasibility of these scenarios is much more likely in a middle class or higher household. Those happen to be the students I work with, but I understand and empathize with folks who might be frustrated with these examples because they work with students in poverty.)

Scenario 1: Save for your retirement . . . before you graduate from high school.

Many students in my school get a job in high school, often over the summer after their sophomore year. If they work full-time over the course of that summer, they could easily gross $3000. Now, being the teenagers that they are, they are most likely going to want to spend a fair amount of that money. And they should. But I would suggest that by exploring the mathematics a bit, they - and their parents - might also want to invest it.

So, if this were my daughter (we'll see if she chooses to get a job after her sophomore year or not), I'd suggest she invest at least a bit of that money in a Roth IRA. And then I would contribute the rest up to whatever her gross earnings were for the year (we'll say $3000 for this example). Here's why:
  1. She won't owe any income tax on earnings that low, so even though Roth IRA contributions are "after tax" contributions, these would effectively be "no tax" contributions for her, and all earnings will be tax free.
  2. I would suggest she invest that money 100% in a low-cost equity index fund, reinvest dividends, and never touch it again until retirement. (No reason not to be 100% in equities for this type of investment and time horizon.)
  3. The current assumption (which I think will change, but we'll go with it) is that a 16-year-old today might retire at age 67 or so, so we're looking at a 50-plus-year investment horizon. What will $3000 grow to in those 50 years? And there's the exponential growth question.
So, what will $3000 grow to in 50 years? Well, to be sure, no one can answer that question, but we can estimate based on a lot of data from past experience. (This is assuming that the way economies and capital markets work will not dramatically change, which I think is perhaps not a good assumption, but for estimating purposes it's the best we've got.) Since 1930, the long-term annualized return of the S&P 500 is about 9.7%. If our 16-year-old achieved that kind of return over 50 years, she'd have about $307,000 at retirement. Just from that one summer's investment. If she works after her junior and senior years and puts in an additional $3000 each summer, she'd be looking at over $900,000.
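If you (or your students) want to check that estimate, the arithmetic is just repeated multiplication. Here's a quick sketch in Python - a spreadsheet works just as well - assuming a single lump sum and a constant annual return, which is of course a simplification of how markets actually behave:

```python
# Compound growth of a single lump sum, assuming a constant annual return.
# This is an estimate, not a prediction: real returns vary a lot year to year.

def future_value(principal, annual_return, years):
    """Future value of one lump sum with annual compounding."""
    return principal * (1 + annual_return) ** years

one_summer = future_value(3000, 0.097, 50)
print(f"One summer's $3000 at 9.7% for 50 years: ${one_summer:,.0f}")  # roughly $307,000

# Three summers of $3000 each, treating each (as a simplification) as a full
# 50-year investment:
three_summers = 3 * one_summer
print(f"Three summers: ${three_summers:,.0f}")  # a bit over $900,000
```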

But since we're talking about 50 years, I think we should at least consider investing in riskier equities that - over time - are likely to achieve a higher return. Since 1930 Large Cap Value has returned 11.2%, Small Cap has returned 12.7%, and Small Cap Value has returned 14.4%. Now, most folks would look at that and say that certainly the amount she'd have in the end would be higher, but I'm not sure they'd realize how much higher.

For the $3000 investment, the total after 50 years for Large Cap Value would be over $600,000, for Small Cap it would be over $1.1 million, and for Small Cap Value it would be over $2.5 million. For a $9000 investment (three summers), triple those numbers. Keep in mind, that's all tax free, and all without contributing any money to her retirement account after graduating from high school. (With the assumption that even if tax laws change, existing accounts will be grandfathered in.)
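Same formula, different rates. For anyone who wants to verify those figures (or have students verify them), here's the quick check - nominal dollars, before inflation, and assuming those long-run averages actually hold for the next 50 years, which is a big assumption:

```python
# One summer's $3000 compounded for 50 years at each long-term annualized
# return quoted above. Nominal (pre-inflation) dollars, constant-return assumption.

returns = {
    "S&P 500":         0.097,
    "Large Cap Value": 0.112,
    "Small Cap":       0.127,
    "Small Cap Value": 0.144,
}

principal = 3000
years = 50

for name, rate in returns.items():
    value = principal * (1 + rate) ** years
    print(f"{name:>15}: ${value:,.0f}")

# Roughly $307,000, $606,000, $1.18 million, and $2.5 million respectively.
```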

Of course now would be a good time to talk with our student about inflation, and how that $7.5 million ($9000 for 50 years in Small Cap Value) in 2067 won't buy the same amount as $7.5 million today. So let's assume an average annual inflation increase of 3.5%. There are lots of interesting discussions to have here about how students could use that information to calculate the end result, but simply discounting our returns by that amount turns that $7.5 million into about $1.99 million in today's dollars, which translates to being able to spend about $80,000 a year (in today's dollars) using the 4% rule. Still pretty darn good, which is why I think this is a worthwhile scenario to explore with students and why I think this might be a better use of time than learning about quartic trinomials.
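For the inflation piece, one way to let students do the adjustment themselves is to work with a "real" return - the nominal return minus the assumed inflation rate - and then apply the 4% rule to the result. Depending on exactly how the discounting is done and on when each summer's contribution is assumed to start compounding, this lands somewhere between roughly $1.3 and $2 million in today's dollars, so treat the sketch below as the method rather than an exact reproduction of the figures above:

```python
# Folding inflation into the Small Cap Value scenario, then applying the 4% rule.
# Simplifications: use a "real" return (nominal minus assumed inflation) and treat
# all three summers' contributions as compounding for the full 50 years.

nominal_return = 0.144    # Small Cap Value long-term annualized return (from above)
inflation = 0.035         # assumed average annual inflation
real_return = nominal_return - inflation

contributions = 3 * 3000  # three summers of $3000
years = 50

nest_egg_todays_dollars = contributions * (1 + real_return) ** years
annual_spending = 0.04 * nest_egg_todays_dollars  # the "4% rule"

print(f"Nest egg in today's dollars: ${nest_egg_todays_dollars:,.0f}")
print(f"Sustainable spending (4% rule): ${annual_spending:,.0f} per year")
```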

Scenario 2: Don't go to college . . . and retire much earlier.

Yes, it's provocative, but that's part of what makes it interesting. I've written before about our assumption that college is the default goal for our students, but let's look a bit closer at the mathematics.

Like most parents, I've paid attention to the tremendous increase in the cost of attending college. We also started saving for college even before we adopted our daughter, using a tax-advantaged 529 plan. We invested in Colorado's plan because, in addition to earnings and withdrawals being tax free, contributions are exempt from Colorado state taxes (which is like earning 4.63% right off the bat). Due to our diligent saving and investing, and the benefits of compound growth (even with 2008), we have about $120,000 set aside in our 529 for our daughter's college expenses.

Well, that sets up an interesting scenario for a problem about exponential growth. What if she didn't go to college and, instead, invested that money now (we'll take a tax hit since it's not being used for college, but I'm willing to cover that), immediately got a job that didn't require a college degree, and continued to add to that investment over the years? Lots and lots of messy details here, which is why it's such a good problem situation to work through with students, but let's look at a simplified version with lots of assumptions just to get the feel for it.

We'll use the same investment return information from Scenario 1, including investing in index funds with 100% in equities, since she's young and has a long investment horizon. We'll assume that she'll get a job paying at least $25,000 per year to start off with, and that each year she'll get a raise that's at least equal to inflation. We'll also assume that she'll be able to save and invest an additional $3000 each year. I realize that can be tough when she's starting at $25,000 per year, but that works out to a reasonable 12% of her income, and perhaps we'll let her live at home for the four years she would've been in college to help her start off. I'm going to make one more assumption, which is that she could retire comfortably on $40,000 per year. That's for just her; if she gets married she would obviously have additional income, additional investments, and additional expenses that would complicate it a bit. But as a family of three we are currently spending about that much (when you take away what we're saving for retirement), so I don't think it's an outrageous assumption for one person.

Well, the numbers are pretty interesting to play with, especially with the excellent FIRECalc tool. Lots of choices to make here as well, but on the first tab (Start Here) I put in $120,000 portfolio to start, with anticipated spending needs of $40,000 per year (today's dollars), and wanting it to last for 80 years (50 years after she retires). I left the second tab blank, meaning I'm assuming no social security or pension income (there probably would be some, but we'll leave it at 0 for now). On the third tab (Not Retired?), I put in a retirement year of 2045 (so that's assuming working for 30 years, starting now), and that she'll add $3000 to her portfolio each year (adjusted for inflation). For the fourth tab (Spending Models), I chose Bernicke's Reality Retirement Plan. The fifth tab (Portfolio), I adjusted to 100% equities. When I do all that, it gives me this. (You'll have to click submit if you follow that link to see the results page yourself, but here's some of the verbiage):
Following the "Reality Retirement Plan" as described by Ty Bernicke, withdrawals after age 55 are reduced by 2-3% per year until age 76.

Because you indicated a future retirement date (2045), the withdrawals won't start until that year. Your contributions will continue until then. The tested period is 30 years of preretirement plus 50 years of retirement, or 80 years.

FIRECalc looked at the 64 possible 80 year periods in the available data, starting with a portfolio of $120,000 and spending your specified amounts each year thereafter.

Here is how your portfolio would have fared in each of the 64 cycles. The lowest and highest portfolio balance throughout your retirement was $120,000 to $56,587,349, with an average of $15,925,319. (Note: values are in terms of the dollars as of the beginning of the retirement period for each cycle.)

For our purposes, failure means the portfolio was depleted before the end of the 80 years. FIRECalc found that 0 cycles failed, for a success rate of 100.0%.
You really should explore FIRECalc some more but, based on a lot of baked-in (but not half-baked) assumptions, it tells me that for the 64 possible 80-year periods that the historical data supports, not once would she have run out of money (and usually she would leave quite an estate). Note that this has her retiring at age 48 and living until 98. (If you want to change it to constant spending power instead of Bernicke's Reality, then you still have an 81.3% success rate. But working just 3 more years, so retiring at 51, would have had a 100% success rate.) Keep in mind all of this is assuming no pension or social security income, which you definitely would have if you worked for 30+ years. After playing around, you can even discover that she could retire in 2036 - so at age 39 - with a 97% chance of success (and with 59 years of retirement, and usually a sizable estate). So, at an age when some college graduates are still paying off their college loans, she could be retired. Provocative enough?

FIRECalc even lets you download a spreadsheet based on your inputs that you could analyze with students to examine (and perhaps manipulate) the formulas. Again, I would suggest this is not only more interesting than quartic trinomials mathematically, but also practically for students. And, of course, there's nothing preventing our student from doing both Scenario 1 and Scenario 2.
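You could also have students build a stripped-down version of the calculation themselves. The sketch below is emphatically not FIRECalc - FIRECalc's whole point is that it replays every historical sequence of returns - it just assumes a single constant real return (7% here, my assumption) to show the year-by-year structure: grow the balance, add savings while working, subtract spending in retirement.

```python
# A toy, constant-return version of the accumulation/withdrawal arithmetic.
# Not FIRECalc: FIRECalc tests every historical 80-year sequence of returns;
# this just shows the structure of the year-by-year formula students could
# build and manipulate in a spreadsheet.

def project(start_balance, annual_saving, working_years,
            annual_spending, retirement_years, real_return):
    """Year-by-year balance: contribute while working, withdraw once retired."""
    balance = start_balance
    history = []
    for year in range(working_years + retirement_years):
        balance *= (1 + real_return)          # investment growth for the year
        if year < working_years:
            balance += annual_saving          # still working: add savings
        else:
            balance -= annual_spending        # retired: withdraw living expenses
        history.append(balance)
    return history

# Numbers loosely following Scenario 2, all in today's dollars:
# $120,000 to start, save $3,000/year for 30 years, then spend $40,000/year
# for 50 years of retirement, at an assumed 7% real return.
balances = project(120_000, 3_000, 30, 40_000, 50, 0.07)
print(f"Balance at retirement (year 30): ${balances[29]:,.0f}")
print(f"Balance at the end (year 80):    ${balances[-1]:,.0f}")
```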

That's just two examples. Lots and lots more you could do with debt (credit cards, car loans, mortgages), governmental policy (budget, entitlements, social security, medicare), and on and on and on. But I don't know anyone that really does, because there's always one more standard we need to cover, and students just might get asked to name a quartic trinomial on some test sometime. It's probably a good thing, though, since we wouldn't want our students to be financially independent and able to retire before we can, would we?

Friday, April 03, 2015

Mission Impossible

My school, like many schools and other organizations, has a mission statement. I can't tell you what it is. This despite the fact that we've had it for just over seven years now and I - along with the entire staff that was here at the time - helped create it. While I haven't done a scientific survey, I feel fairly confident in saying that if you asked five random staff members at my school what our mission statement is, there's a pretty good chance none of them would be able to tell you. And I feel even more confident that if you asked five random students at my school, they wouldn't know either. Which means it's mission impossible.

That doesn't mean I don't generally like what's in our mission statement (and, for that matter, our longer vision statement). You can read them here (pdf). I do want students to achieve their potential, collaborate and be life-long learners, and contribute to society. The problem is that when you have a mission statement that no one knows, and that has generic statements like that, it ends up being pretty meaningless. I've written about this previously, talking about core values and whether students can articulate the vision, but clearly it's still bothering me. I'd like something that's meaningful, and that we can post in each and every classroom so that each day, students and teachers could refer to it. Any student would be able - and expected - to ask, how is what we are learning today going to help fulfill our mission? And if the teacher doesn't have a good answer, then they should stop teaching it. Similarly, every teacher would be able - and expected - to ask the same of what students were doing with their time.

So I was feeling all good and outraged, but then I asked myself, "What's my mission statement?" Uh-oh. I don't have a good answer for that, even though it's something I've thought about. Some of the Language Arts teachers at my school do an activity with students called "What's Your Sentence?," based on an idea in Daniel Pink's Drive. In the past, they've put out an email to staff asking for their sentence that they can share with students, and I always star it in my email and stare at it for a week before feeling guilty and not replying. This will be no surprise to regular readers of this blog, but my problem is that I can't figure out just one sentence that captures it for me.

I usually start with a sentence something like this:
To help those around me become more passionate learners.
But then I start picking at it. Shouldn't I include myself in there? But that makes the sentence awkward.
To help myself, and those around me, become a more passionate learner.
And learner about what? Do I really want them to become a more passionate learner about something that isn't meaningful for them? Say, perhaps, our somewhat arbitrary curriculum? And so I try something like,
To help those around me discover and pursue their passions.
But then that doesn't explicitly mention learning, and becoming a better learner. And it leaves out "myself" again. So then I try something like,
To help myself and those around me discover and pursue their passions by becoming more passionate learners.
or perhaps
To help myself and those around me discover and pursue their passions by becoming better learners.
Yuck. It's about now that I remember why I was better suited to teach mathematics than language arts, and then I dial back my outrage (at least a little) about my school's mission statement. So I try changing the order,
To help myself and those around me become better learners and discover and pursue their passions.
Maybe a little better, but it's awkward with the multiple 'ands', and I still don't quite like the phrase 'better learners.' So then I'm reminded of another post where I reference something David Jakes wrote talking about culture, and I wonder if somehow my mission statement should try to talk about a culture of learning.
To help myself and those around me develop a culture of learning; one where we help each other discover and then pursue our passions.
Getting closer, but I'm still not sure I can really do it in one sentence. But when I start adding sentences, it gets too involved and less clear. So does that mean it's "mission impossible" for me as well? If I can't articulate what I'm trying to accomplish, what my purpose is, does that mean that I'm doomed to fail? Maybe.

What about your school's mission statement? Or your sentence? Do you have something straightforward and meaningful that your school - and you - can rally around? I'm obviously still struggling with my own, but I think for those of us working in schools, it's something important to talk about. And, even if we don't come up with one perfect sentence or one perfect mission statement, I think we should be willing to post what we do have in each and every one of our learning spaces, and ask our students to hold us just as accountable as we hold them.

Wednesday, April 01, 2015

FaceTime + Stoodle + Desmos = Virtual Algebra Help

Nothing earth-shattering in this post, but I thought I'd share in case it might give someone an idea.

A friend of ours is a 15-year-old freshman at a nearby high school. Unfortunately, she's been dealing with some serious medical issues that have kept her home-bound almost all of this school year. She has a home-bound tutor provided by the school district and is doing her best to try to keep up, but it's tough. Her medical condition causes her to be in pain and often extremely fatigued, so it's very difficult for her.

For the last couple of months I've been trying to go over on the weekends to help her with Algebra, which allows her to concentrate on her other subjects with her district-provided tutor. Because of her fatigue, however, she often has to cancel or cut short our sessions. In an effort to provide more of an "on-demand" option for her, so that I can help her on short notice when she does have some energy (and cutting out the drive time to get to her house and back), I cobbled together a technology solution that, so far, seems to work reasonably well.

Our district is using Agile Mind as the textbook for Algebra, so the text is online. The teacher emails her all the worksheets from both the in-class work and the homework, as well as which parts of Agile Mind they go with when applicable. (I'm not a particularly huge fan of Agile Mind or worksheets but, as worksheets go, these are better than average, with lots of material based on the work of the Charles Dana Center.) She tries to figure out the concepts on her own (she's a talented math student), but it's tough without being in class, so I end up doing a lot of questioning to help with teaching/explaining, as well as just overseeing her procedural work.

This is what we have set up for the virtual option.
  • She FaceTimes me from her iPad to my Mac. She rests her iPad on a stack of books and uses the rear-facing camera to stream video of the worksheet she's working on as she's working on it. I see a reasonably large version on my Mac, and we can talk back and forth. (She can also see me if she wants/needs to.)
  • Since I'm mostly questioning, she does most of the writing, but at times I want her to be able to see something I'm writing (either working out an example, drawing a picture, creating a table, etc.). Since I'm using the Mac's camera for FaceTime, it's not particularly convenient to use paper and then hold it up in front of the iSight camera. So for my writing we use Stoodle. It gives us a simple, shared online whiteboard in real time. (She can write on the Stoodle as well and I can see it, but I didn't want her to then have to copy it to the worksheets that she has to turn in.) I use my wife's or my daughter's iPad to make the writing/drawing easier, and she has the Stoodle open on her Macbook.
  • Since this is Algebra, Desmos is very helpful as well. We've used it a lot in our face-to-face sessions, and now we can use it virtually as well by sharing a link or copying and pasting into the Stoodle. (The Desmos link is not "live" to both of us simultaneously, but still works pretty well).
I think face-to-face is still better and more productive but, given the unpredictability of when and how long we can work together, this has worked really well to make the most of the times when she does have energy.

Wednesday, March 25, 2015

Whose Test Is It?

This is a long post, and rambles a bit. My Mom said the other day that I'm wordy, and she's right. But it's my blog and I'll be verbose if I want to, verbose if I want to . . . Also, keep in mind that I would prefer to radically change what we do each day in school (school should be "different", not "better"), but this post is written from the perspective of how we can do what we are currently doing better.

While it's clear I'm not a fan of standardized tests such as PARCC, there is one advertised feature of PARCC that I think is an improvement over previous state-mandated testing we've done. It's the idea that we would get the results back faster, in which case - whatever their value - we at least would perhaps be able to use them to help students. Now, so far, I don't see any indication that we actually will get those results back faster (end of the school year is better than next fall, but still not very helpful), but perhaps as they iron out the wrinkles that will happen.

Research indicates that timely and effective feedback is key for student learning growth so, if the point of assessment is to help students learn more effectively, then both "end-of-the-year" and "next fall" don't do us much good. While we don't have much (any?) control over state assessments like PARCC, we do have control over teacher generated and given assessments in our classrooms. So what frustrates me is the timeliness and effectiveness of the feedback we often give, because this is something we do have some control over.

The genesis of this post was when a student I know well took three major tests on the Friday before Spring Break. Which means that, in the best-case scenario, this student won't receive any feedback for at least ten days. How "timely" and "effective" do you think that feedback is going to be? Now, let me be clear, as a teacher I've done this before as well. You want to finish a "unit" before a scheduled break in school and you want to assess before that break while it's still fresh in their minds. But that doesn't make it right and, as I've gotten older (and hopefully wiser), I've done my best to resist that urge. While not a perfect solution, I at least tried to give any assessments the day before the last day before a scheduled break so that they could receive feedback before going on break.

Which brings up the next issue, which is how quickly we get these assessments back to our students. Now, in this case, it's going to be at least ten days due to Spring Break, but what about assessments that aren't given right before Spring Break? Here's my thinking. If something is important enough that we are going to assess (and grade) all of our students at one point in time, and we are expecting all of our students to be ready and to take that assessment, then we should be willing to commit to return that assessment to them, with feedback, the very next day that class meets.

This, obviously, is an opinion that some folks will take issue with. They'll point to a limited amount of time for teachers, and many competing obligations, and the sheer amount of time it takes to grade assessments of multiple sections in only one to three days (depending on your schedule and weekends). I readily acknowledge those issues, I just don't think they are a worthy excuse. Again, if this assessment is important enough to give to all your students at one time, and if the goal of the assessment is to determine how well they know this essential material and then to help them learn anything they are still confused about, then as teachers we need to get this back to our students with meaningful feedback as soon as possible. While immediate feedback is often best, slightly delayed (the next day that class meets) feedback can be useful as well. Greatly delayed feedback? Not so much.

So that addresses "timely," but what about "effective?" What does effective feedback look like? I am in no way an expert on this, and there are many books you can read to help you with this, but I do think I can identify a few practices that aren't effective. Let me focus on two of them. First, feedback that is just a grade, or perhaps a grade with a few things circled, is usually not going to be effective feedback. Second, an assessment that you don't give back to the students and allow them to keep is usually not going to be effective.

Letting students keep assessments is a controversial topic for some teachers, so let's explore that a bit. In my experience, every reason given for this basically boils down to the same reason: cheating (with a side helping of time). Some teachers don't like to let students keep the assessments because they are worried other students will use them to cheat, either because they were absent when the assessment was given and need to make it up, or that students will pass the assessment along to future students. There's a fairly easy way to solve that problem, of course, which is to have several versions of that assessment made, which is where the side helping of time comes in - teachers will say they simply don't have time to create multiple versions of their assessments. I disagree.

Creating quality assessments is obviously a complicated issue that can't be addressed in this post, but the majority of assessments I see in schools fall into three categories: textbook-generated assessments, teacher-created assessments, and essay-type assessments (either textbook or teacher generated). (There are obviously other types, but I think these three do a fairly good job of putting them into categories.) For those teachers that use textbook-generated assessments, the software will easily create multiple versions of the assessment for you. For those that decide to go a little further and create their own assessments, it will take a bit more time, but it's not that hard to create multiple versions of the same assessment. (And, if you're really good, which I'm not, great assessment questions are really hard to cheat on anyway, so you don't need multiple versions.) Essays are both easier and tougher. Easier because they are harder to cheat on, tougher because they do take a fair amount of time to evaluate and provide feedback on. In my perfect world, though, that feedback is being provided throughout the writing process, so there's really not one "due date" where the essays have to be turned in and evaluated en masse.

There are some other strategies that I think are helpful. In math and science classes especially, for example, I still see most teachers giving unit assessments that take a long time for students to take and a long time for teachers to evaluate. Why not give shorter assessments more frequently? This not only makes it easier to provide more timely feedback, but it gives students more frequent feedback as well. I also see many teachers trying to assess everything, instead of just what we've identified as being essential. Which is better, assessing everything and providing delayed and incomplete feedback? Or assessing only a few really important things, and giving our students thorough feedback in a timely fashion? I would suggest the latter.

Finally, let's talk about final exams. Many high schools, including mine, give final exams during the last week of the semester. At the end of first semester we have winter break, and then students may or may not have the same course and teacher when the next semester starts two weeks later. At the end of the second semester, students go to summer break. The vast majority of students don't get any feedback (other than the grade on the online portal) on these final exams. Some folks will suggest that since these are "summative" assessments, it's not that important to give feedback. I think that argument only works if you view each course as its own isolated world with a goal of finishing the course and getting a grade. If we truly value what we are teaching in that course and think that it's important for students to learn, then it doesn't "end" when the course ends. From this viewpoint, all assessment is formative.

If we're going to continue to have final exams, then I have a simple suggestion: don't give them on the last days of school each semester. Give them a few days before and then allow each class to meet at least one more time after the final exam in order to give the assessments back to the students and provide feedback to them. With my school's schedule, for example, that would require two class days after final exams, one running a MWF schedule and one running a TR schedule. Some folks will argue that students won't use that time well, or might not even show up, and that may be true. But if that's the case, then what does that say about what we're doing in the first place? If what we are doing is truly valuable, then students will want to show up.

For me, it boils down to what is the purpose of assessment. Whose test is it? If it's designed for the adults, then the prevailing practices are probably just fine. But if it's designed for the students, to help them learn and grow and be successful, then we need to do some rethinking about how we assess and provide feedback to - and for - our students.

Monday, March 16, 2015

Monitoring Student Use of Social Media

This past weekend a story regarding Pearson monitoring social media for "security breaches" related to PARCC was a popular topic of conversation in my network (original story, although the server often has trouble handling the traffic). I'm not going to focus so much on that story here, as many others have written about it, other than to point out one thing. While students don't seem to have the opportunity to agree - or not - to the terms of service of PARCC, our states, school districts and schools do. We all agreed to this; it is part and parcel of administering PARCC to our students. (Not sure if it was just Colorado, but as a proctor I had to sign a form agreeing to the terms of service.) So I think it is worth some conversation at the state, district and school level about whether we are okay with this or not. Schools can't really blame Pearson for doing what they said they would do (although others can).

In response, many folks have wondered why we aren't outraged by the many school districts that are also monitoring students' social media use. Now, to the best of my knowledge, my district is not actively monitoring our students' use of social media. But in some respects, I am. Let me explain.

As part of my presence on Twitter I engage in at least two activities that wander into the territory of monitoring. I have a search column set up for the name of my school, primarily so that I can retweet mentions of my school and occasionally answer questions or address concerns. And - as a result of various interactions over the years - I follow some students, both former and current. On occasion, both of these activities have resulted in me coming into contact with student behavior that I have acted upon. This ranges from contacting a student to suggest that a particular tweet might not be looked favorably upon by a college admissions officer or future employer (and discuss digital footprint with them), to meeting with a student and their counselor because there is some concern the student might be engaging in behaviors that could be harmful to themselves or others.

Now, I don't think this is "actively monitoring" my student body. I am not attempting to monitor all student accounts, nor am I actively looking for "misbehavior" on the part of our students. But I readily admit that this could be a slippery slope. It's all well and good for me to say that I'm not surveilling our students, but are they just supposed to trust me on that?

This is something I've thought about a lot. A. lot. And I'm still not completely comfortable with where I've landed, because I think this is a very complicated subject and the parameters around it are constantly changing along with the uses of social media. But, at the moment, this is my best attempt to thread the needle of privacy vs. obligation. If we see a student in need, are we not obligated to try to help? I've chosen to err on the side of caring, but that doesn't mean I might not cross the line.

Part of the way I'm currently viewing this is through the lens of a parent. I ask myself as a parent of a teenager, if another caring adult noticed something of concern in my daughter's social media activity (or any activity for that matter), would I want them to ignore it? I would not. On the other hand, I wouldn't want her school (which, conveniently or inconveniently, is also my school) to be searching through her social media activity looking for something we deemed "inappropriate." It's a fine line.

Because this is such a tricky issue, some school districts have implemented policies to limit or forbid employees' use of social media in relation to their students. My district does not currently have such a policy, but they are working on a draft of one (including possible rules around texting). I think this is a mistake. We don't have policy around whether a teacher can talk to a student in the grocery store or at a volleyball game, whether they can call a student's home or what they are allowed to say to them in the hallway, so we don't need policy specifically regarding social media. Our existing policies cover social media just fine; we don't need a new policy for every new technology or social media platform that is created. As near as I can tell, these policies are really not about student safety, but about school district liability. I don't think anyone believes that simply having a policy in place would stop an adult who means a student harm from acting; the policy is just there so the school district can say we have a policy against it.

Our students are active in these spaces. We have a choice: we can ignore these spaces and implement policies designed to protect our institutions, or we can thoughtfully engage with our students and try to help them learn, grow and stay safe. I'm reasonably comfortable with my current position, although I'm constantly reexamining it to see if my thoughts have changed. I'm curious as to how others navigate this issue. Is it okay to "infringe" on a student's privacy if they are at risk? How do we determine they are at risk? Who decides?

Monday, March 02, 2015

#myoptoutletter

Our daughter will be opting out of the PARCC testing this spring at my high school. Some folks will applaud this decision, others will vehemently disagree, but we thought it was important to share our thinking. This is the letter we submitted to my administration and the school board this morning.



February 28, 2015
To: Arapahoe High School Administration and LPS Board of Education

This letter is to let you know that our daughter will be opting out of the PARCC testing in the Spring of 2015 (both the PBA and the EOY). This request is not meant in any way to reflect poorly on Arapahoe High School or Littleton Public Schools. Our daughter loves her teachers and frequently comes home and tells us what a good job they are doing, with specific examples of what she thinks they did well. But as educators with a combined 48 years teaching every grade level (except Kindergarten and 2nd grade) from Pre-K through 12th, as well as professional development for adults, we do not feel like this testing is in the best interests of our daughter or the school.

We feel that the skills that this testing purports to measure reflect a very narrow and flawed version of what it means to be educated; of what it means to learn and to have learned. We don’t necessarily think that the standards themselves are bad; as standards go most of the Common Core State Standards (and the Colorado modification of them) are well written. To paraphrase Yong Zhao, there’s nothing wrong with the Common Core State Standards, as long as they weren’t common and they weren’t core.

While at times we may disagree with a specific assessment one of her teachers gives her (the content, the format, or the way it’s delivered), in general we believe that her teachers are in the best position to assess her progress as a learner (in conjunction with our daughter herself). More importantly, we believe these teacher-given assessments at least have the potential to help her grow as a learner. Standardized testing such as PARCC, however, is mostly designed to meet the needs of adults.

Instead of taking the tests, she will instead use that time to learn. She might read a book, or work on assignments from her teacher, or watch videos on YouTube of things that interest her, or perhaps just catch up on sleep to compensate for the ridiculousness of beginning school for teenagers at 7:21 am each day. Whatever she does, it is more likely to contribute to her growth as a learner than taking the tests, and less likely to negatively impact her and her school as a whole.

We don’t just think that these tests are bad for our daughter, we believe these tests are bad for all the students at Arapahoe, and for Arapahoe in general. These tests are forcing teachers to narrow their focus; to value a fixed, pre-determined set of skills that someone else has decided that all students need over the needs and desires of the living and breathing students that are actually in their classrooms. While there are many criticisms we would make about the curriculum currently being taught and the restraints that imposes on both teachers and learners, we still put our trust in Abby’s teachers to make the best of that curriculum.

But in our current environment, the mandated testing is overwhelming teachers’ abilities to make decisions in the best interest of their students. Because the results of these tests are being used to evaluate teachers, teachers and administrators are being forced to toe the line in order to keep their jobs. While some folks would argue that this “only” represents 50% of a teacher’s evaluation, we have both seen how this has come to dominate all the discussions of teaching and learning in our schools. I would ask school administrators the following question: If there is a teacher who you have observed many times over the years that you feel is a master teacher, and yet the results of mandated testing over a narrow band of skills don’t support that, would you really change your evaluation of that teacher? There is so much more to teaching and learning than students simply performing well on a single test on a single day.

Make no mistake, we believe in high standards, we just don’t think that this approach actually helps promote them. We believe you can have high standards without being standardized; in fact, we don’t think it’s possible to truly have high standards if you are standardized. The goal of K-12 education is not to help all students master a pre-determined, fixed set of knowledge all at the same time and at the same pace. Algebra may (or may not) be important for all students to learn, but it is ludicrous to state that all students must learn it by the time they are fifteen years old. Why not fourteen? Or sixteen? If a student decides they need - and want - to learn Algebra at eighteen and master it then, is that so bad?

Anyone who has had children, or has met more than one of them, knows that each and every student is different and learns differently, yet we continue to act as if they are widgets on an assembly line, performing the same processes for the same amount of time on each one of them, and expecting that they will all turn out identical at the end of the line. Not only is this not true, we shouldn’t even want it to be true. We say we value diversity and each individual student, that we value and cherish the individual personalities and strengths of each and every child, yet we’ve developed a system that values conformity and compliance over individuality and initiative. We say that we value critical thinking, yet we are apparently unwilling to model it for our students.

We believe in a vision of education that focuses on the needs of each student over the needs of the system. We believe that school should be a place where students are encouraged to pursue their passions, and then actually prepare them to achieve those passions. That doesn’t mean we don’t value community; we believe one of the greatest strengths of the concept of public schools is bringing together students with different strengths and different backgrounds into a common space where they can learn and grow together. Where they can find others who share their passion, but also learn with and alongside those who have other passions. We believe that the way you meet the needs of society is by meeting the needs of each individual student. If you truly meet each student’s needs, then in the end you will meet the needs of society.

For all of these reasons (and many more, but this is already fairly long), we are choosing to opt our daughter out of testing. We have given her the option of opting out each year but this is the first time she has chosen to do it; previously she has never wanted to stand out and “be different” than the other students. She is aware enough now to understand, however, that taking these tests is not only not in her own best interests, but also not in the interests of her friends, classmates and teachers. We think this is important enough that we would give her this option even if it did “negatively” impact Arapahoe or Littleton Public Schools but, thankfully, with the recent changes at the state level surrounding the 95% participation rate, that will not happen.

Which is why we also have a request for the leadership of Arapahoe and Littleton Public Schools. Littleton Public Schools is the highest scoring district in the Denver Metro area, and one of the highest scoring districts in the state, and Arapahoe scores very well as a school. This puts the school and the district in a position where others might listen if they stood up and said this is not in the best interests of our students. A school and a school district that always come out looking good under this system is in the unique position of making the case for why this approach is fatally flawed. Instead of simply reacting to events and the decisions of others, we would ask you to lead.

We - the students, parents, educators and citizens of Colorado - need you to be proactive, not reactive. Instead of reacting to and appeasing the folks who are imposing this system on us, we need you to advocate for a different version of learning, a truly higher standard of what we expect from our schools, a vision for what school can and should be. We don’t need schools that are “better” at scoring well on standardized tests, we need schools that are different, and we need you to advocate for that vision and for our students. We hope you will. Our students deserve nothing less from us.

Sincerely,

Karl and Jill Fisch



More Information

Colorado Department of Education

 Denver Post

United Opt Out
Update 3-4-15: LPS has a page (not sure if it's brand new or was just updated) with FAQs about PARCC/CMAS that includes a mention of opting out.

Tuesday, February 10, 2015

Real Leaders Sometimes Lose

This post is going to veer away from the usual education focus and slightly into politics, but I think it's related.

The Denver Post ran an editorial today titled, Repeal TABOR? It's not happening, where they said,
Gov. John Hickenlooper told an assembly of school administrators last week what some of them clearly didn't want to hear: that any effort to repeal the Taxpayer's Bill of Rights would be "doomed." But Hickenlooper is very likely right about the odds, and education leaders shouldn't waste their time urging political leaders to undertake the electoral equivalent of the Charge of the Light Brigade.

Remember the thrashing that Amendment 66, which would have raised the income tax for education, sustained two years ago? Any attempt to repeal TABOR outright could easily face an even worse drubbing.

Hickenlooper was responding to a request by Boulder Superintendent Bruce Messinger that the governor lead a campaign to repeal TABOR, according to Chalkbeat Colorado. "We will need the governor to lead that charge," Messinger said.

To which Hickenlooper replied: "To take on that battle ... right now, that would be a doomed effort."

Indeed it would. Opponents of a repeal effort would have a field day portraying the campaign as contemptuous of popular opinion and bent on huge tax hikes. 
The Denver Post, like many media outlets, pundits, and politicians themselves, has succumbed to the viewpoint that governing (and politics) is always (and only) about winning. It's not.

I find it interesting that nowhere in that article does the Post's editorial board actually discuss the merits of repealing TABOR; the editorial is only about whether it's a winning issue or not. And, to be clear, they are probably right that it would be a long shot to pass. But that's not the point.

What we need is real leadership, from Governor Hickenlooper, the state legislature, and even the Denver Post. Real leadership would realize that TABOR, Gallagher and Amendment 23 all hamstring our elected leaders from actually governing. That they are a horrible way to govern in a representative democracy, and they effectively make it impossible for our state government to operate efficiently and effectively, and to plan and implement policy.

Real leadership would look at the polls, realize it's most likely a losing issue, and take it on anyway. Real leadership would realize that this is so important that it's worth spending a lot of time and effort educating the public on it, even if it loses. Real leadership would propose repealing all three amendments and ask the voters to let their elected leaders actually govern.

It's not "contemptuous of popular opinion" to see a serious problem and then try to educate voters on why it's a problem and propose a solution. How many times in history has "popular opinion" been absolutely, utterly wrong and immoral? Would the Post suggest that Abraham Lincoln, Susan B. Anthony and Martin Luther King, Jr. (to name just a few) were "wasting their time?"

It may indeed be a doomed effort, but that doesn't mean it's not worth fighting. And sometimes even doomed efforts succeed. After all, I'm sure the Post thought that when a little known junior Senator from Illinois announced his candidacy for President in 2007, it was a "doomed effort." In fact, I bet when a little known bar owner, who was a failed geologist, decided to run for Denver mayor, that was a "doomed effort" as well. I wonder whatever happened to him?

The basic problems with TABOR/Gallagher/Amendment 23 can be easily explained in less than five minutes. What if Governor Hickenlooper spent five minutes explaining those problems each and every day at each and every event he was at? And what if other like-minded leaders in Colorado - on both sides of the aisle - also took five minutes at each and every stop in their day and described the problem? And what if the Denver Post, instead of focusing on winning and losing and the horserace aspects of politics, actually tried advocating for a solution?

So many of our problems today can be traced back to a lack of leadership. Whether it's education policy, the dysfunctional United States Congress, or the Colorado State Government being unwilling to have an honest conversation with the voters of Colorado about how TABOR, Gallagher, and Amendment 23 are crippling their ability to govern, our problems come down to folks being more concerned about political "victories" than actually trying to find solutions and solve problems.

What we need is leadership. Real leaders sometimes lose, but they choose to fight the battle anyway, because they know it's the right thing to do. And because they know that leading sometimes means being out in front of the crowd and that, over time, you can bring the crowd along with you. That's not being contemptuous of public opinion, that's leadership.

Wednesday, February 04, 2015

If I Had A Million Dollars

We first started seriously discussing laptops for our students in the fall of 1999. At that time, the obstacles were cost and infrastructure (wireless), and not everyone was convinced that they would help students learn. Over the years the cost came down, the infrastructure began to be built out, and more and more folks were convinced that laptops would not only be helpful for students, but essential to their learning process. Yet still we didn't do it.

It took until the Fall of 2012 to pilot a program, and then the Fall of 2013 to roll it out for all Freshmen at AHS. We did it via a Bring-Your-Own-Device program, counting on a large percentage of our students to bring their own, and then we would provide laptops (netbooks) for those who couldn't afford one or didn't want to bring one. The district provided support in terms of helping us with a few netbooks and, more importantly, guaranteeing that if we didn't get enough students bringing their own, they would help us financially to make up the difference. It turns out that our students did bring their own in the expected amounts (roughly 65% that first year, and now well over 70%), but it was nice to have that insurance. Since then we've rolled it out to two classes (this year's Freshmen and Sophomores), and next year we will roll it out to a third class (Freshmen, Sophomores and Juniors), and possibly to our Seniors as well depending on a few things (more on that later).

Two weeks ago my school began receiving what will ultimately be 993 Chromebooks from our district. These weren't purchased because we've finally decided that laptops are important enough instructionally for our students to provide them; we're receiving them due to mandated state testing. Because both the PARCC and the CMAS tests are taken via computer, and because we can't sufficiently lock down the netbooks we had previously, the district decided to replace them with Chromebooks - and, of course, we had to add significantly more in order to test all of our students. After sixteen years of not being willing to spend money to support our students instructionally, we are willing (actually, forced) to spend money to support testing. Our Superintendent told us in a faculty meeting that district-wide more than $1 million was being spent to purchase Chromebooks.

Now some folks might argue that I shouldn't complain, we are getting laptops that we will be able to use instructionally when we are not testing. (And, given this influx, this may allow us to accelerate our rollout to include Seniors next year - one year early.) I am certainly appreciative of this, and we will do our best to take full advantage of it, but I still think it's important to note the priorities of our national and state leaders, and what actually makes school districts spend money they otherwise wouldn't.

Since we have so many of our own students bringing their own devices, much of this $1 million will end up sitting most of the time in carts, unused (once we've rolled out Connected Learners to all four grades). So I wonder what else we could've spent $1 million on? I'm sure we could all come up with lots of ideas, but here's one pretty simple one: let's hire more teachers.

Now, I realize that $1 million doesn't go very far when you're talking about hiring teachers, but what if we did this: what if we hired eighteen teachers and provided six teachers each to three elementary schools in our district that we identify as being the most at-risk? Each school could decide how best to utilize those teachers. One school might decide to create one more class at each grade level (K-5), thereby lowering class sizes and student-to-teacher ratios across the board. Another school might decide to leave classes the same, but have one teacher work at each grade level, helping the existing teachers co-teach, or working with individual or small groups of students. Or a school might choose to place all six of those teachers in K-2, creating two extra sections at each level. How many of you think any of these ideas - or some permutation I haven't enumerated - would have a more positive effect on students than state-mandated testing? Which is more likely to change students' lives?

The problem with testing isn't limited to the dubious quality of the data we get when we purport to measure what's "important" for students to know. It's the opportunity cost of the testing. It's not just the $1 million spent on chromebooks that will often sit in carts instead of spending it on something that will help students learn. It's the tremendous monetary value of the staff time that goes into administering these tests including, but not limited to, a district assessment coordinator and their secretary, building-level assistant principals and counselors that spend an inordinate amount of time coordinating these tests, and the time that teachers spend in proctor training for these exams.

And then there's the value of the lost instructional time, not just the time students spend taking the tests, but the time taken in class to prepare for the tests (even teachers who don't do test-prep are very much encouraged to expose their students to the format of the test ahead of time), and the lowered quality of the instructional time that we typically have on testing days (where we test in the morning and have altered schedules in the afternoon).

And then there's the effect on students, both psychological and philosophical. Where they are stressed by the testing, and their motivation is decreased by constantly being told what they aren't good at. And it's the philosophical message we send to students, that being able to prove that adults are doing their job is more important than the students' learning.

If I had a million dollars, I'd buy you the opportunity for more learning, not more testing.

Friday, January 09, 2015

What If We Just Tried It?

Michael Feldstein, Dave Cormier (1, 2), Stephen Downes and many others in the comments had an interesting discussion around student learning and engagement that's worth your time to check out. While I agree with Chris Lehmann that perhaps engagement isn't always the word we're looking for, I think the discussion in the above posts is using engagement in the right way; the students aren't just engaging in the activity, but in the learning.

You should read the posts (and the comments), but I wanted to pull a few quotes out to highlight and think about.
So. In this case, we’re trying to make students move from the ‘not care’ category to the ‘care’ category by threatening to not allow them to stay with their friends. Grades serve a number of ‘not care to care’ purposes in our system. Your parents may get mad, so you should care. You’ll be embarrassed in front of your friends so you should care. In none of these cases are you caring about ‘learning’ but rather caring about things you, apparently, already care about. We take the ‘caring about learning’ part as a lost cause.
The problem with threatening people is that in order for it to continue to work, you have to continue to threaten them (well… there are other problems, but this is the relevant one for this discussion). And, as has happened, students no longer care about grades, or their parents believe their low grades are the fault of the teacher, then the whole system falls apart. You can only threaten people with things they care about. (Cormier, emphasis mine)
I've had many discussions with fellow educators around these same ideas, and I find it interesting that we so quickly dismiss "caring about learning" as a lost cause and therefore have to find all these other ways to coerce students into learning. I wonder whether, if we just stepped back and really thought about that statement and what it says about what we're doing, we might figure out that we're doing it wrong.
Why bother learning how to use all these “effective instructional strategies” when people aren’t even going to engage with them? (David Wiley, in the comments).
For my purposes, I might modify that to say "when people aren't even going to care about what they're learning." More and more I'm struggling with the idea of learning about what someone else cares about, for someone else's sake, which is what I feel like we're doing. Yes, folks will argue it is still for the student's sake, but if they don't care about what they're learning, then aren't we putting our needs in front of theirs?
The issue for me, then, is more the mismatch between my students’ desires to connect and what I, or the curriculum, wants them to connect to. Almost all my students want to connect to certain people, ideas, skills, and professions, but most of them do not want to connect to academic writing, the subject I happen to teach. Schools are not adept at, or even interested in, identifying students’ existing interests and playing to those interests. We should be. There is great capital in students’ interests and desires for connection, and we are squandering it. (Keith Hamon, in the comments, emphasis mine)
Separate from the institution of school, when you think about learning, doesn't it start with interest? Then why in school do we think we need to start with curriculum and hope that it will generate interest?
My take is different. I see education less as an enterprise in making people do what they don't want to do, and more as one of helping people do what they want to do. (Stephen Downes)
Exactly.
Stephen is referring to ‘education’ and not to ‘learning’. That word usually indicates that we are talking about the institutions that support learning inside of our culture rather than the broader ‘learning’ that happens as part of being alive. Our education system is always a victim of the need for bureaucratization. It’s terrible… but it’s a necessary evil. (Cormier)
I wonder at the assumption that it's a "necessary evil." I often argue the practical side as well, so I totally get what Dave is saying, but have we ever really tried to do it differently? Given the affordances of modern learning (technology, access to information, connectivism, a relatively high standard of living - at least in my part of the world), perhaps we should examine the assumption that 'education' and 'learning' need to be so very different.
I’m suggesting that we need to replace the measurable ‘content’ for the non-counting noun ‘caring’. Give me a kid who’s forgotten 95% of the content they were measured in during K-12 and I will match that with almost every adult i know. Give me a kid who cares about learning… well… then i can help them do just about anything. We simply don’t need all that content, and even if we do need it, we don’t have it anyway . . . We currently have ‘this student has once proved they knew tons of stuff’ as our baseline for ‘having an education’. That’s dumb. (Cormier)
Exactly.
If you have a second, Dave, check out Matthew Lieberman’s book Social, particularly Ch. 12 where he discusses education. He echoes your point on p. 282, where he writes: “We spend more than 20,000 hours in classrooms before graduating from high school, and research suggests that of the things we learn in school, we retain little more than half of the knowledge just three months after initially learning it, and significantly less than half of that knowledge is accessible to us a few years later.”
Brutal. Yet we continue to double down. (Dave Quinn, in the comments)
I think most of us know this, both intuitively and from experience, yet we continue to "double down." It's like we acknowledge that what we're doing is ridiculous but, hey, it would be really hard to do it differently, so let's just keep doing it.
The Gallup Purdue Index Report picks up where Wellbeing leaves off. Having established some metrics that correlate both with overall personal happiness and success as well as workplace success, Gallup backs up and asks the question, “What kind of education is more likely to promote wellbeing?” They surveyed a number of college graduates in various age groups and with various measured levels of wellbeing, asking them to reflect back on their college experiences. What they didn’t find is in some ways as important as what they did find. They found no correlation between whether you went to a public or private, selective or non-selective school and whether you achieved high levels of overall wellbeing. It doesn’t matter, on average, whether you go to Harvard University or Podunk College. It doesn’t matter whether your school scored well in the U.S. News and World Report rankings . . .
What factors did matter? What moved the needle? Odds of thriving in all five areas of Gallup’s wellbeing index were
  • 1.7 times higher if “I had a mentor who encouraged me to pursue my goals and dreams” 
  • 1.5 times higher if “I had at least one professor at [College] who made me excited about learning” 
  • 1.7 times higher if “My professors at [College] cared about me as a person” 
  • 1.5 times higher if “I had an internship or job that allowed me to apply what I was learning in the classroom” 
  • 1.1 times higher if “I worked on a project that took a semester or more to complete” 
  • 1.4 times higher if “I was extremely active in extracurricular activities and organizations while attending [College]” 
. . . It really comes down to feeling connected to your school work and your teachers, which does not correlate well with the various traditional criteria people use for evaluating the quality of an educational institution. If you buy Gallup’s chain of argument and evidence this, in turn, suggests that being a hippy-dippy earthy-crunchy touchy-feely constructivy-connectivy commie pinko guide on the side will produce more productive workers and a more robust economy (not to mention healthier, happier human beings who get sick less and therefore keep healthcare costs lower) than being a hard-bitten Taylorite-Skinnerite practical this-is-the-real-world-kid type career coach. It turns out that pursuing your dreams is a more economically productive strategy, for you and your country, than pursuing your career. It turns out that learning a passion to learn is more important for your practical success than learning any particular facts or skills. It turns out that it is more important to know whether there will be weather than what the weather will be . . .
. . . The core problem with our education system isn’t the technology or even the companies. It’s how we deform teaching and learning in the name of accountability in education. Corporate interests amplify this problem greatly because they sell to it, thus reinforcing it. But they are not where the problem begins. It begins when we say, “Yes, of course we want the students to love to learn, but we need to cover the material.” Or when we say, “It’s great that kids want to go to school every day, but really, how do we know that they’re learning anything?” It’s daunting to think about trying to change this deep cultural attitude. (Michael Feldstein, emphasis mine)
And there it is. It's a systemic problem, and we depend on that system to create order out of chaos and, of course, for our employment. It truly is daunting to think about trying to change this and yet . . . we should try anyway.

I think Carol Black nails it when she says,
This is when it occurred to me: people today do not even know what children are actually like. They only know what children are like in schools.
I think we've forgotten that despite all the good intentions behind the idea of schools, and the fact that good stuff does indeed happen in them, they are terribly artificial constructs. Again, as Black says,
Traits that would be valued in the larger American society –– energy, creativity, independence –– will get you into trouble in the classroom . . .

When you see children who do not learn well in school, they will often display characteristics that would be valued and admired if they lived in any number of traditional societies around the world. They are physically energetic; they are independent; they are sociable; they are funny. They like to do things with their hands. They crave real play, play that is exuberant, that tests their strength and skill and daring and endurance; they crave real work, work that is important, that is concrete, that makes a valued contribution. They dislike abstraction; they dislike being sedentary; they dislike authoritarian control. They like to focus on the things that interest them, that spark their curiosity, that drive them to tinker and explore . . .

But any Maori parent knows that you have to watch a child patiently, quietly, without interference, to learn whether he has the nature of the warrior or the priest. Our children come to us as seeking beings, Maori teachers tell us, with two rivers running through them — the celestial and the physical, the knowing and the not-yet-knowing. Their struggle is to integrate the two. Our role as adults is to support this process, not to shape it. It is not ours to control. 
Last night my wife was talking about one of her first graders who is really struggling with school right now and she said something like, "He doesn't want to do anything he doesn't want to do." That makes us both wonder, "Then why are we making him do it?"

So many of the problems that our children have in school are a result of school itself, not any inherent problem in the children.
So one hypothesis is that American schools are not only assuming the normal developmental window for reading to be too narrow, they’re also placing it too early. In other words, it’s not like expecting all children to take their first steps at the average age of twelve months: it’s like expecting them all to take their first steps at the precocious age of ten months. In doing this you create a sub-class of children so bewildered, so anxious, whose natural processes of physical and neurological development and organization are so severely disrupted, that you literally have no way of knowing what they would have been like if you had not done this to them.
“Grade level standards,” please recall, do not exist in nature; they are not created scientifically, but by fiat. And there has been almost no serious study of cognitive development in children whose learning has not been shaped by the arbitrary age grading of the school system. Finland simply sets its standards at a place where most children will succeed. The U.S. sets them at a place where a really significant percentage will fail. This is a choice. In making it, we may be creating disabilities in kids who would have been fine if allowed to learn to read on their own developmental schedule. (Black)
So what if we stopped making them "do what they don't want to do?" What if we tried helping them do what they want to do?
We totally want to be in the business of helping people do what they want to do. Try it. No really. Just try it. Sit down with a child and help them do what they want to do. (Cormier)
What if we just tried it?

Wednesday, January 07, 2015

What Grade Should They Get?

If you've followed this blog for a while, then you're probably aware that I'm not a big fan of grades. I won't rehash the philosophical underpinnings of why I'd like to get rid of grades, but I thought I'd briefly share three recent examples that I think help illustrate why you might want to rethink the way you grade even if you don't agree with me that we should eliminate them entirely.

One of the big frustrations I have when discussing grades with others (whether that be teachers, students, or parents) is that the argument frequently comes down to an unfounded faith in percentages. The argument goes something like this:
  • Well, we have to have grades. (I disagree.)
  • You have to set a cut off somewhere. (Why?)
  • This is the percentage the student earned, math never lies, so this grade is accurate and fair. (Oh really?)
Recent Example #1
It's toward the end of the semester and a student has an 89.5% in a class. They turn in a review guide and get a 20 out of 20 on it. What happens to their overall grade? Does it go up? Stay the same? Or go down?

The vast majority of folks say it will go up. The answer, of course, is it depends. In this particular case, the grade goes down. Yes, a student who has an 89.5% in the class turns in their review guide assignment like a good student should and gets a 100% on it, yet their grade still goes down.

How is that possible? Well, this teacher weights their grades by category. This assignment falls in the Homework category, which gets a weight of 10%. Because this teacher previously offered some extra credit (which is a whole different blog rant), the student's percentage in the homework category before the review guide was turned in was 105.7%. After turning in a correctly done review guide, their percentage in that category drops from 105.7 to 105, and their overall grade drops from 89.5 to 89.4 (which, for many teachers, is the difference between an A and a B - most teachers in my building will "round up" an 89.5).
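If you want to play with the arithmetic yourself, here's a minimal sketch in Python. The homework point totals are my assumptions (the real ones live in the teacher's gradebook), but I picked them to be consistent with the 105.7% and 105% figures above, and the remaining 90% of the grade is lumped together for simplicity.

```python
# A minimal sketch of Example #1, under assumed point totals.
# The homework numbers are chosen to match the 105.7% -> 105% drop above;
# the real numbers come from the teacher's gradebook.

def category_percent(earned, possible):
    return 100 * earned / possible

def overall_grade(category_percents, weights):
    # weights maps each category to its fraction of the overall grade (sums to 1)
    return sum(category_percents[c] * w for c, w in weights.items())

weights = {"Homework": 0.10, "Everything else": 0.90}
other = 87.7  # assumed combined percentage for the other 90% of the grade

hw_before = category_percent(148, 140)           # extra credit -> ~105.7%
hw_after = category_percent(148 + 20, 140 + 20)  # add a perfect 20/20 -> 105.0%

print(overall_grade({"Homework": hw_before, "Everything else": other}, weights))  # ~89.5
print(overall_grade({"Homework": hw_after, "Everything else": other}, weights))   # ~89.4
```

A category that's already over 100% can only get dragged back toward 100% by a perfect score, and that drag flows straight through the 10% weight into the overall grade.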


In effect, the student is penalized for turning in a perfect assignment. What grade should they get?

Recent Example #2
At the end of the semester a student has an 89.1% in a class out of a total of 2,389 points. What happens to their overall grade if they scored 1 point higher on one single assignment earlier in the semester?

Again, of course, it depends. In this particular case, it would raise their overall grade to 89.815%, which, again for most teachers in my building, is probably the difference between a B and an A. Some of you will doubt that 1 point out of 2,389 can raise a grade from an 89.1 to an 89.815, but it can. This teacher weights categories as well, and one of their categories, titled Homework Checks, is worth 10% of the overall grade. Here are the student's scores in that category:


See that Slope Quiz on October 31st that the student scored a 7 out of 8 on? If they had received an 8 out of 8, their category percentage would rise to 100%, which would increase their overall percentage in the class by 0.715 points, from 89.1 to 89.815.
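Here's the same kind of sketch for this one. I've assumed the Homework Checks category totals 14 points, which is consistent with the roughly 0.715-point swing; the actual point totals are in the screenshot.

```python
# A minimal sketch of Example #2, assuming the Homework Checks category
# totals 14 points (consistent with the ~0.715-point swing described above);
# the real point totals come from the gradebook screenshot.

WEIGHT = 0.10  # Homework Checks counts for 10% of the overall grade

def category_percent(earned, possible):
    return 100 * earned / possible

before = category_percent(13, 14)  # with the 7/8 Slope Quiz -> ~92.9%
after = category_percent(14, 14)   # with an 8/8 instead -> 100%

swing = WEIGHT * (after - before)
print(round(swing, 3))         # ~0.714 points added to the overall grade
print(round(89.1 + swing, 3))  # ~89.81 -- across most teachers' A/B line
```

Because the category is so small, one point moves the category percentage by about seven points, and the 10% weight turns that into nearly three-quarters of a point on the overall grade.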

One point, on one quiz, on one day. What grade should they get?

Recent Example #3
Here are a student's percentages in the different categories for a particular class:

Homework: 100%
Tests & Quizzes: 88%
Lab Reports: 88%
Participation: 100%
Checkpoints: 85%
Responsibility: 100%
Final Exam: 74%

What grade should this student get in this class?

Well, we could have a long and valuable philosophical discussion about this, but the point of this example is that this student could get two different grades in the same class at my school. How? It depends on what teacher they have and how that teacher weights their categories. Here's what it looks like for three teachers of this class in my building:


And here's what that translates to for the student's percentages in each category:


These teachers all teach the same class, and students are scheduled into their sections randomly by the computer. This student could have performed exactly the same and, in one class, received an 89.2% (a B), an 89.5% (probably an A, but possibly a B), or a 90.4% (an A), simply because the teachers choose to weight the categories differently. Oh, and there are two other teachers of this course who grade on total points, so the student would have yet another percentage that we can't determine from this information.
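If you want to see the mechanism without the screenshots, here's a sketch that runs the same category percentages through three weighting schemes. The weights below are made up (the real ones are in the screenshot above); the point is simply that identical scores produce different overall percentages once the weights differ.

```python
# A minimal sketch of Example #3: one set of category percentages run through
# three hypothetical weighting schemes. The real teacher weights are in the
# screenshot above; these are invented just to show the mechanism.

scores = {
    "Homework": 100, "Tests & Quizzes": 88, "Lab Reports": 88,
    "Participation": 100, "Checkpoints": 85, "Responsibility": 100,
    "Final Exam": 74,
}

teacher_weights = {
    "Teacher A": {"Homework": 0.15, "Tests & Quizzes": 0.35, "Lab Reports": 0.15,
                  "Participation": 0.05, "Checkpoints": 0.05, "Responsibility": 0.05,
                  "Final Exam": 0.20},
    "Teacher B": {"Homework": 0.20, "Tests & Quizzes": 0.30, "Lab Reports": 0.10,
                  "Participation": 0.10, "Checkpoints": 0.10, "Responsibility": 0.05,
                  "Final Exam": 0.15},
    "Teacher C": {"Homework": 0.25, "Tests & Quizzes": 0.25, "Lab Reports": 0.10,
                  "Participation": 0.10, "Checkpoints": 0.05, "Responsibility": 0.10,
                  "Final Exam": 0.15},
}

for teacher, weights in teacher_weights.items():
    overall = sum(scores[c] * w for c, w in weights.items())
    print(f"{teacher}: {overall:.1f}%")  # same scores, three different overall grades
```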

The same student, in the same class, with the same curriculum, at the same school. What grade should they get?



All three of these examples are real, from my school, from the end of last semester, although I did manipulate the overall percentages for effect (the assignments and student scores in examples 1 and 2, and the teacher weights in all three examples, are real).

So, even if you believe grades are worthwhile (or if you don't believe grades are worthwhile but you have to give them anyway), I would at least ask that you spend a little more time thinking about them. Your computer grade book is mathematically accurate; it computes exactly what you tell it to compute. But that doesn't mean it makes sense. You are the professional, and if you give a grade to a student you should come up with a more thoughtful way to assign that grade than simply relying on a percentage.