HawkInsight


OpenAI Sora: Challenges to Reality

AI companies seem to be racing to make our collective online disinformation problem terminal.


As we all know, online disinformation is a huge problem - one that arguably tears communities apart, rigs elections, and causes certain groups around the globe to lose their minds. Of course, no one seems particularly concerned with actually fixing it. In fact, the tech companies most responsible for online disinformation (and therefore the institutions best positioned to address it) seem to be doing everything they can to make the problem worse.

Case in point: OpenAI unveiled its new text-to-video generator, Sora, on Thursday. The model is designed to let web users generate high-quality AI videos from text prompts alone. The app is currently wowing the internet with its bizarre visual imagery - whether a Chinese New Year parade, a person running backwards in the dark, a cat on a bed, or two pirate ships battling in a cup of coffee.

Despite OpenAI's stated mission to "change the world," arguably the company's greatest contribution to the internet so far has been the instantaneous generation of countless terabytes of digital junk. All of the tools the company has released to the public are content generators, and experts warn they are likely to be used for fraud and disinformation campaigns.

In a blog post about Sora, OpenAI's team publicly acknowledged that the new app could have negative effects. The company says it is working on watermarking techniques to tag content created by its generators, and is reaching out to experts to figure out how to mitigate the ecological toxicity of the AI-generated trash Sora will inevitably produce. Sora is not currently open to the public, and OpenAI says it is building systems to deny users the ability to generate violent or pornographic imagery. The statement reads:

We will engage with policymakers, educators, and artists around the world to understand their concerns and identify positive use cases for this new technology. Despite extensive research and testing, we cannot predict all of the beneficial ways people will use our technology, nor all the ways people will abuse it.

The framing here is a bit comical, because it is already completely obvious how OpenAI's new tool can be abused. Sora will generate fake content at enormous scale - that much is certain. Some of that content will likely be used for online political disinformation, some will feed various frauds and scams, and some will simply be harmful. OpenAI says it wants to set meaningful limits on violent and pornographic content, but internet users and researchers have repeatedly shown how adept they are at jailbreaking AI systems to generate content that violates a company's usage policies. All of this Sora content will inevitably flood into social media channels, making it harder for ordinary people to tell real from fake and making the internet in general more annoying. I don't think it takes a global panel of experts to figure that out.

There are other obvious downsides. First, Sora - and similar tools - will have an adverse environmental impact. Researchers have shown that text-to-image generators are far more energy-intensive than text generators, and that creating a single AI image can consume as much energy as fully charging a smartphone. Second, the new text-to-video technology is likely to hurt the livelihoods of video creators - why would companies pay people to produce visual content when a video can be created at the click of a button?

Disclaimer: The views in this article are those of the original creator and do not represent the views or position of Hawk Insight. The content of the article is for reference, communication, and learning only, and does not constitute investment advice. If it involves copyright issues, please contact us for deletion.