While data privacy remains one of TikTok's biggest challenges, the company may face a larger obstacle to staying in the United States: content moderation.
Challenges over harmful content and misinformation aren't TikTok's alone. Leaders from Meta, Twitter, Google, and other companies have previously faced the ire of Congress, defending their apps' policies in a showdown between free speech and dangerous material.
"There has always been interest in regulating the space," said Emile El Nems, Moody's vice president and senior credit officer. The question that we face is how do we go about it? And how easy is it to be implemented?"
Concerns over China's ability to spy on U.S. citizens through TikTok kicked off CEO Shou Chew's testimony before Congress on Thursday. But many members of Congress used their time to grill the executive about the actual content on the platform. Several disturbing posts shown as examples encouraged gun violence, promoted eating disorders, and provided inaccurate medical advice.
While Chew was emphatic that such content violated TikTok's policies, he noted that the issues were not unique to his company and said TikTok was investing in artificial intelligence and other tools to prevent harmful posts.
"You gave me only generalized statements that you're investing, that you're concerned, that you're doing work. That's not enough for me. That's not enough for the parents of America," said Rep. Diana DeGette (D-Colo. 1st District)

Shelby Knox, campaign director at the nonprofit ParentsTogether, agreed that TikTok's efforts to prevent harmful content are not working.

"They actually have the audacity to say that they are doing enough," Knox said. "Yet, when children's deaths over the past few months have been attributed to TikTok challenges, they're absolutely not doing enough. And the idea that parents have the tools and the resources to go up against a billion-dollar company who was looking for their child for ad dollars is laughable and offensive."
However, she acknowledged it's a problem facing all apps that allow user-generated content.
"Social media across the board is dangerous because it is unregulated," Knox said. "They are issuing tech all of the time that is untested and experimenting on our kids. It is dangerous. When we talk about regulating social media platforms, there is no 'we can get rid of one,' and that will be the worst, and that will save more kids. This is across the board a danger to kids." 
Legislation could also slow revenue opportunities for the social media sector, which has tech companies worried. Already, many companies reliant on digital advertising are seeing marketing budgets shrink amid concerns over an impending recession and a weak economy.
"If legislation impedes growth, then multiples will have to contract," said Moody's El Nems. "And if multiples contract that means the stock prices will have to come down from a credit profile from the fixed income point of view. If growth slows down, then the question becomes will the profitability be impacted?"
Add to that uncertainty over how to enact regulation. Existing laws like Section 230 of the Communications Decency Act give tech companies immunity from liability for content posted by users on their platforms. Initially, the legislation was meant to help the growing Internet and bloggers. However, as the online world has evolved and user content is posted at lightning speed, many believe the laws have become antiquated. Still, declaring in Washington, DC, that things need to change is easier said than done.
"They themselves are having a difficult time now if we can legislate and pass a law that changes the calculus," El Nems said. "but for now, it's very difficult to make an assumption that it will be changed — and how will it be changed?