thebabyhater
I figured I'd ask this because politics and philosophy in general (at least in the USA) seem to be turning more socialist every day. It seems like corporations and big businesses are now seen as some looming enemy run by soulless demons in suits instead of something positive.
I'll share my opinion. As a business owner, I've gotten fed up with this portrayal of big business as something negative. I see building a large company as something to strive for. Not everybody should be equal in terms of wealth; instead, those who work the hardest and the smartest should be rewarded with more of it (i.e. the heads of successful corporations). I know that's the direction I'm working toward.
Basically, my long and convoluted question is as follows: how do you feel about big business? Is it the evil tyrant that the current political machine is painting it to be, or is it actually a good thing?