Consider this example for a hypothetical outsourcing vendor. This company has decades' worth of contracts, statements of work, white papers, blog posts, and PowerPoint presentations that it has used to market to and engage with clients and potential clients. It wants to point ChatGPT at that content and then use the chatbot for a variety of purposes—to generate fast responses to requests for proposals, for example, or to summarize the firm's thinking on various topics when its executives are asked to make presentations at conferences. The company wants to know that once the chatbot has had access to this proprietary data it won't allow that data to be used unfairly by the company's competitors. Or, where reuse would be acceptable, the company may want to ensure that it is credited, or even compensated, for that reuse. It also wants to be sure it's not giving away any trade secrets or violating any confidentiality agreements it may have signed with clients.
If all of those pieces of IP were transformed into NFTs with embedded smart contracts and stored on a blockchain, the company could flag each with codes telling ChatGPT which bits could be used freely, which could be used only with attribution, which only with permission, and which only with the payment of royalties or some other form of compensation. ChatGPT would need only simple tweaks to recognize these codes.
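To make the idea concrete, here is a minimal sketch of what such usage-rights codes might look like in practice. Everything in it is hypothetical: the rights categories, the asset records, and the reuse check are illustrative stand-ins, not any real NFT metadata standard or smart-contract interface.

```python
# Hypothetical sketch: each piece of IP carries a usage-rights code
# that an AI system could check before reusing the content.
from dataclasses import dataclass
from enum import Enum

class UsageRights(Enum):
    FREE = "free"                # may be reused without restriction
    ATTRIBUTION = "attribution"  # reuse allowed only with credit
    PERMISSION = "permission"    # reuse requires explicit approval
    ROYALTY = "royalty"          # reuse requires compensation

@dataclass
class IPAsset:
    title: str
    owner: str
    rights: UsageRights

def may_reuse(asset: IPAsset, has_permission: bool = False,
              royalty_paid: bool = False) -> bool:
    """Return True if the asset's rights code allows reuse right now."""
    if asset.rights in (UsageRights.FREE, UsageRights.ATTRIBUTION):
        return True
    if asset.rights is UsageRights.PERMISSION:
        return has_permission
    return royalty_paid  # UsageRights.ROYALTY

# Hypothetical assets for the outsourcing vendor in the example:
whitepaper = IPAsset("Outsourcing Trends", "Acme Corp", UsageRights.ATTRIBUTION)
client_sow = IPAsset("Client SOW", "Acme Corp", UsageRights.PERMISSION)

print(may_reuse(whitepaper))                       # True: credit required, reuse allowed
print(may_reuse(client_sow))                       # False: no permission granted
print(may_reuse(client_sow, has_permission=True))  # True
```

In a blockchain-based version of this, the `rights` field would live in the token's metadata and the checks would be enforced by a smart contract rather than by the consuming application; the decision logic, however, would look much like the function above.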
Alternatively, imagine this same company decides to use a generative AI chatbot to create a proposal for a client, only to learn that a key component of the proposal largely mirrors concepts developed and perhaps copyrighted by one of its competitors—potentially inviting a lawsuit. In a world where most organizations stored and tagged their IP on a blockchain before making it available to generative AI chatbots, the potential for infringing on someone else's intellectual property would drop dramatically.
Companies are wrestling with these concerns right now. They want to be first movers and they want to create new value for their clients and customers. At the same time, they don't want to create risks they may not be able to manage. Some have already ended up in court. In early 2023, for example, stock photo company Getty Images sued Stability AI, maker of the AI-based image generator Stable Diffusion, over alleged copyright violations. Getty argued in a statement to the press that Stability AI had "unlawfully copied and processed millions of images protected by copyright" to train its software for its own commercial benefit. Getty is seeking up to $1.8 trillion in damages.
How claims of copyright violations or misappropriation of IP might play out in court is largely unknown at this point, given how new the field of generative AI is. Almost certainly, it will take years before the courts catch up to the technology.