I know it will be frowned upon to ask this, but are the drug companies 100% bad?

I think we have huge problems with healthcare in America (mostly that it's crazy expensive), and I think drug companies are very much to blame. I'm just curious if there is anyone in this sub who doesn't think the solution is universal government healthcare. Is there anyone out there? I'd love to hear your thoughts.

The fact that drug companies can make a profit off of these medications has done a few good things. Some people research better medications and cures because they want to help people, but profit is also a motivator. People complain about the insulin that's available for cheap, but that's all anyone had years ago. When the stuff we have now has its patent run out and it's cheap in the future, people will complain that it isn't as good as whatever the new stuff is. Guess what? The new stuff wouldn't have been invented if companies weren't going to be able to charge more for it. It's all more complicated than that, but there is a reason America has invented the majority of the effective diabetes treatments in the last 50 years. We do have a lot to be thankful for. :) Bring on the downvotes.