
What Do Sports Teach You?

Updated: Mar 30, 2021

FREDDY BANDA, Staff Writer


I was on the football team last year, but not this year. Last year, when school started, things felt different because I missed a whole week of practice and a game. When I came back from an injury, the coach spoke to me about how to be a better teammate. He said I needed to come to practice every week, even during summer practice; there are no days off when you are trying to get better as a team. That was something I had to learn the hard way. The coaches said I could work as a manager, taking the water out before practice and bringing it back after, but I chose not to.


Sports teach you about responsibility. Responsibility means you have to show up, be on time for practice, and use positive language. Teamwork is important because you have to get along with people. You're not going to like everyone you work with, but you have to find a way to get along with that person. This is an important skill in life because it will help you accomplish more and succeed. Football showed me how to stay positive, and it taught me about leadership in my everyday life.


