Future Utopia Topic - We Can't Fix Our Society or Civilization Until We Fix Ourselves

Indeed, there is a quote I remind people in our think tank of quite often, and it goes something like this: "We never want to create the perfect society and civilization, because the humans will come around and mess it up the next day. Rather, what we should do is make everything 95% perfect and give all the little humans something to work on, after they've seen the progress created thus far."
Now then, why do you suppose I say that? Well, there is another quote that I use often: "The great thing about running a think tank is that you have job security, because there is an unending number of problems in the world, and the humans continually run around and create more." Thus, when I talk to people about some futuristic utopia, I also warn them that "the road to hell is often paved with the best intentions," and to "be careful what you wish for."
Next, I'd like to point out that we cannot fix our civilization or society until we fix ourselves. It really is humans who cause all the commotion, challenges, and problems. Evolution didn't cause the problems; it's been solving problems for millions of years. Each time society advances, or our technology gets better, there are always unintended consequences. It's much like making a new rule: once you make one rule, there's too much incentive to keep making more rules to shore up all the loopholes and challenges you create along the way.
From a philosophical standpoint, you could say that all complex systems are like this: a constant struggle between chaos and organization, regardless of the domain. Okay so, what do we need to fix amongst humans, you ask? Well, that's a very good question, and one that deserves an answer. Perhaps the first and foremost thing I see wrong is how often we sweep the dirt under the carpet and hide reality behind political correctness. You can't solve real problems using false scenarios or fake input.
You see, it wouldn't matter if you had the world's best supercomputer, fully self-actualized and running the best artificially intelligent software ever created, in the present period or in the future. Bad data in means bad information out. Some say that humans all need to get on the same page to accomplish something great and make the world perfect. And yet, as I've said above, even if the world were perfect, who is to say the angry masses wouldn't come in the middle of the night and destroy it? After all, hasn't history shown that to be the case? Indeed, I hope you will consider all this and think on it.