I do not understand how teachers have become the villains of society and the scapegoats of every politician facing a budget deficit. Teaching is one of the most important professions in an advanced society. Teachers have the responsibility of preparing our youth for the challenges of daily life as American adults.
In other words, teachers are possibly the greatest influence on the direction a person’s life takes. Teachers, in many cases, have more of an influence than parents in this regard. Successful citizens with the ability to be productive members of society directly affect our nation’s future.
As such, teachers should be treated with respect. Instead, we are mocked, degraded and trivialized by media outlets, politicians and anyone else with an opinion about the state of our education system. We are seen as a drain on our nation's economy. We are seen as selfish. As New Jersey Gov. Chris Christie said, "We are only out for our paycheck."
How did we get to this point? At what point in our history did teachers become the bad guys? The emphasis on education and the high regard in which teachers were held in the past is part of what made this country great. Is it really any wonder that we are falling behind the rest of the industrialized world now?
Let's face it: If teaching were so easy, more people would do it. The reality is that the vast majority of Americans are not cut out to teach, and just as many have no idea what teaching entails.
If policymakers were truly concerned about a high-quality education for our nation's children, they would make sure teachers were well paid and well respected. Maybe then our nation's best and brightest would go into teaching.