Depending on how you look at it, it was.
The only two countries to come out of WWI stronger were the United States and Japan. I'd like to see this article, because I'm going to guess the author wasn't referring to actual US military involvement.
Most of the powers in WWI got their funding through lines of credit opened with Wall Street and other major American banks. It was during WWI that the center of global commerce shifted from London to New York. In this sense, the United States won the First World War quite big, and it knew it too. One of the reasons Wilson wanted to keep the country out of the war was that the United States was reaping BIG profits from war loans and exports, and entering the war militarily would severely dampen those profits. Unfortunately, those big gains ended up backfiring, as the major European powers had borrowed so much that they had no way of paying it back. This was one of the contributing factors to the Great Depression, since the Allies' war-debt repayments were coming due right around 1930.
Japan, meanwhile, gained HUGE tracts of territory when it got involved, because Russia was slowly falling apart. This was the perfect opportunity for Japan to finish the job it had started against Russia back in 1904–05, and Lenin, trying to keep Russia from descending into anarchy after the revolution, had no choice but to capitulate.
Now, if the article claims it was a US victory because of its military involvement, then yes, that's just insensitive and extremely arrogant.