Jobs that should be among the best paying in America


The best paying jobs in America should be those involving human services — not doctors, necessarily, but nurses, social workers, and state benefits coordinators. Why, I often wonder, are sports teams paid millions upon millions of dollars to do a job they love but which helps no one? Yes, they scored a touchdown, but that will not feed, clothe, house, and educate the millions of Americans who go without these things every day. Something is wrong with a society which condones paying billions for entertainment and then tightens the purse strings when it comes to compensating individuals whose sole occupation is to provide assistance to the public. - Perry Hall, MD

I feel that the best paying jobs should be the ones that have the most positive effect on the world around them. Doctors should be paid well, as well as those who practice natural medicine and eastern healing. Scientists and those who work in fields dedicated to creating greener ways of living deserve more pay, as do those who practice sustainability in their chosen fields. All in all, I believe that your pay should equate to your benefit to the health and happiness of the society around you, and the world around you. - Farmington, NM

The best paying jobs in America should be the ones that help humanity, that save lives and protect the people.  These would be jobs like doctors and other medical personnel, teachers and professors, police, firefighters, and scientists.  Their minimum salary should be $100,000 a year with full benefits.  These careers could easily be paid that amount if salaries were made more reasonable for actors, athletes, and politicians, who all make a ridiculous amount of money and do much less for humanity than the workers named above.  Those workers do their jobs out of a desire to help others and make lives better.  The ones who get paid the most now do it for self-glory and prestige.  And they get paid more!  It makes no sense. - Frankfort, KY

The best paying jobs in America should be the jobs nobody wants to do. For instance, who wants to be a garbage man? Undertaker? Exterminator? Not many people want jobs that are, in their opinion, distasteful. These jobs should pay more because the ones wanting them are few and far between. It takes a very special type of person to do these jobs, but someone has to do them. Right? Where would we be if no one picked up the trash for, let's say, a month? These very special people deserve higher pay. We don't even like to be behind a garbage truck, let alone putting garbage in it. There are people working hard around the clock to ensure good living standards for us. People we may even take for granted. Too many of us have never thought much about them. All of us have a job to do that makes a difference in our world. All are important. Some unpleasant but necessary. Let us not forget the people behind the menial jobs working hard for all of us. - Locust, NC


The best paying jobs in America should be in teaching. Adults can work anywhere, but children must go to school to learn to become adults. When our culture rewards teachers the same way it rewards money managers and corporate CEOs, then our country will flourish with a new generation of scientists, inventors, physicians, and historians. These new professionals will care for every aspect of our increasingly complex, ever more globalized world. When we are skimpy with teachers' pay, we all but assure that students do not learn, and their upkeep in prison will cost us far more. Sociologists advise states to plan prison construction based on third-grade reading scores. There is no investment more important than the education of the next generation, who will govern and solve problems for all of us at the height of their careers. - Chicago, IL

I think the most underrated and underpaid job around the world is a teacher. A teacher has the biggest task to complete: teaching the youth of a nation. Teachers are hard working and among the most devoted people in the world. They work year-round and mainly do their job because they're passionate about helping others, not because of how much they get paid. That said, there are good and bad people working in the country's education system. There have been scandals involving teachers having affairs with students. Even so, the good ones still deserve to be well paid. - San Jose, CA

The best paying jobs in America should be the jobs that no one wants to do.  The easiest examples that come to mind are garbage man and janitor.  Collecting the garbage is essential to daily living in America, and it is not a job many people line up for.  Janitors clean public areas, which allows all of us to enjoy them, and again no one is lining up for that job. - Morrisville, PA

The best paying jobs in America should be teaching and health work. Teachers educate the youth of our country and mold them into bankers, law enforcement officers, realtors, and so forth. They are the core of what kids grow up to be. If nobody wants to be a health worker, it will be hard to find good nurses — those who will take care of the elderly and sacrifice a lot of hours. It is true that those in the financial sector make more money, and they are also partly to blame for the financial mess we are currently in. That is why any job that contributes positively, in a selfless way, to future generations should be on the list of the best paying jobs in America. The problem is, it will never happen, and those who really do the good deeds will often be forgotten, while the good money goes to those who don't often deserve it. - FL