Nope. Same old, same old, if you ask me: Germans see ties worsening while Americans remain positive.
In the U.S., seven-in-ten say that relations with Germany are good, a sentiment that has not changed much in the past year. Germans, on the other hand, are much more negative: 73% say that relations with the U.S. are bad, a 17-percentage-point increase since 2017.
Americans want more cooperation with Germany, but Germans don’t reciprocate.

Do Germans EVER see anything getting better? Seriously, I know they aren't as gloomy as Russians, but they don't exactly have a reputation for optimism.