Are all kids taught that their country is the greatest and that they should be proud of their history?
Is it like how every parent tells their kid they're beautiful, when we can all see that their kid is clearly a nightmare come to life?
The USA seems to be battling this right now (or always has been?). There are parts of our history that are shameful, embarrassing, and morally and ethically wrong, and there are many who think we should gloss over those parts or eliminate them altogether.
There are people living in states like Alabama, where poverty is rampant and the education on offer is rated among the lowest in the country, yet they are some of the most vocally proud Americans, because that is what they were indoctrinated in public school to believe.
These people have grown up and are hearing some of this history for the first time, and since it's clearly shameful, it can't be reconciled with the pride that was instilled in them, so they decide it's lies, or something designed to destroy America's legacy. Now they want to dismantle federal influence on education so each state can lie to its students in a more tailored way. This is only going to get worse.
How is your country teaching its children about its history?
EDIT: I went to US public schools, and I was taught that the USA is the greatest country that has ever existed on Earth. If you are not from the USA, which country were you taught is the greatest ever?