Our businesses, governments, and social institutions are all built on collections of data, and the use and control of that data underpins civilization. In the past, most value transferred in society was physical; today, that value is digital.
The important difference between the virtual and the physical is the cost of replication: with a few keystrokes we can duplicate nearly any digital asset. The value of a digital asset therefore depends on who has access to it and who controls its use. Recent developments in privacy-enhancing technologies and non-fungible tokens now allow us to share digital assets while controlling their use and distribution.
The quantity of data we produce as a digital society has surpassed our ability to interpret it. This has driven the rise of machine learning and artificial intelligence: algorithms that parse our data, separating signal from noise. Their predictive results are among the most valuable digital assets, because they allow us to peer into the future.
The market, however, is fragmented. The owners of the data are often not experts in AI/ML, and the experts in these technologies do not have access to the data, which is restricted by technology, regulation, and fear. If providers give access to their data, it can be copied, resulting in a loss of value.
What would benefit everyone involved is the ability to treat virtual assets like physical assets. You should be able to allow use of your data while maintaining ownership; once the data has been used, it returns to the owner without exploitation. Code written for the data pipelines that power AI and ML should likewise be protected, and its authors should receive provenance and compensation.
There are now technology stacks that enable this functionality. The first is a set of Privacy Enhancing Technologies (PETs) that allow a data provider to give access to their assets while granularly controlling use and replication. The second uses NFTs and blockchain technology to control access and to provide a trusted record of transactions between the parties.
The missing component is a system that allows all members of the AI value chain to meet, collaborate and monetize their efforts.
ai.market provides that service: a system in which numerous private data stores are accessible using privacy-preserving technology. AI creators can offer a percentage of the revenue derived from their models to the providers of the data and to the other contributors to the model's creation. ai.market provides the tools, the access, and the distribution of funds.
A project creator negotiates a revenue share in advance, and ai.market enforces the agreement and provides the front-end APIs to sell access to the model. In this way data providers can monetize their data without fear of losing control, the AI model creator gets access to extensive private data on which to train the model, and the end customer receives reliable, high-speed access to the models via APIs.
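The enforcement of a pre-negotiated revenue share can be sketched as a simple split over incoming API revenue. The party names, percentages, and the `distribute` helper below are illustrative assumptions, not ai.market's actual agreement schema:

```python
from decimal import Decimal

# Hypothetical revenue-share agreement, negotiated in advance.
# Names and percentages are illustrative only.
agreement = {
    "model_creator": Decimal("0.50"),
    "data_provider_a": Decimal("0.30"),
    "data_provider_b": Decimal("0.15"),
    "marketplace_fee": Decimal("0.05"),
}

def distribute(revenue: Decimal, shares: dict) -> dict:
    """Split model-access revenue according to the pre-negotiated shares."""
    assert sum(shares.values()) == Decimal("1"), "shares must total 100%"
    return {party: (revenue * pct).quantize(Decimal("0.01"))
            for party, pct in shares.items()}

payouts = distribute(Decimal("1000.00"), agreement)
```

Using `Decimal` rather than floats avoids rounding drift when the same split is applied across many small API payments.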
The entire system is automated and managed by ai.market, which acts as custodian of the models and distributes payments to the contributors. Details about the models are presented in a market setting that includes a reputation system and economic incentives to improve both the models and the sources of data.
ai.market provides economic incentives to its Vaders (value-add providers) to improve the quality of the data pipelines. This allows creators of value, such as coders, evaluators, and facilitators, to monetize their contributions directly, solving an issue endemic to the open-source world: those who monetize open source are typically operators, not creators. In the market, a contributor receives compensation directly from the market any time their contribution is used.
Each data set is listed in a standard format so it can be easily identified. We provide these listings from the ai.market homepage and allow them to be searched using internal or external search engines.
The second component is a standard set of records that provides the format and all details necessary to create a machine learning algorithm. A comment and reputation system will let you see what others' experience with a data source has been, along with their qualifications.
The final component is the mechanism by which payment for access to the data source can be automated and called on demand. The automation includes a token that acts as an API call; each use of the token produces a blockchain-based record of the transaction. This immutable record is available to buyer, seller, and regulatory parties. The API token enforces the permissions defined by the data provider and protects the resource from duplication or abuse.
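The three components above can be sketched together: a standard listing record, a permission check that enforces the data provider's terms, and an append-only, hash-chained log of token calls standing in for the blockchain record. Every field name, the `Ledger` class, and the hash-chaining scheme are assumptions for illustration, not ai.market's published schema or chain format:

```python
import hashlib
import json
import time
from dataclasses import dataclass, field

# Hypothetical standard listing record (fields are illustrative).
dataset_record = {
    "name": "retail-transactions-2023",
    "format": "parquet",
    "schema": ["timestamp", "sku", "qty", "price"],
    "permitted_uses": ["model_training"],   # granular, PET-style permissions
    "replication_allowed": False,
}

def authorize(record: dict, requested_use: str) -> bool:
    """Enforce the data provider's permissions before serving an API call."""
    return requested_use in record["permitted_uses"]

@dataclass
class Ledger:
    """Append-only log of token-gated calls. Each entry embeds the previous
    entry's hash, so tampering with any entry invalidates all later ones,
    mimicking the immutability of a blockchain-based record."""
    entries: list = field(default_factory=list)

    def record_call(self, token: str, use: str) -> dict:
        prev = self.entries[-1]["hash"] if self.entries else "0" * 64
        entry = {"token": token, "use": use, "ts": time.time(), "prev": prev}
        entry["hash"] = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()).hexdigest()
        self.entries.append(entry)
        return entry

ledger = Ledger()
if authorize(dataset_record, "model_training"):
    ledger.record_call(token="tok_abc123", use="model_training")
```

A disallowed use such as resale simply fails the `authorize` check, so no data is served and no ledger entry is written.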
It is expected that predictive models derived from providers' data will themselves be monetized through the marketplace: AI organizations will partner with data providers to create models that can be sold, with the resulting revenue shared.
The legal frameworks for these transactions are the responsibility of the market.
The marketplace has groups of Vaders (value-add providers) who can be engaged to provide specific functionality across the lifecycle from raw data to end-user model. The market provides the economic incentives and the tools to engage and compensate the Vaders, but they remain autonomous entities. The market's reputation and trust system allows its users to evaluate whether to include particular Vaders in delivering market services.
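One way such a trust system could weigh feedback is to value ratings by the size of the transaction they came from, so a Vader's score reflects stake-weighted experience rather than raw review counts. The weighting below is an illustrative assumption, not the market's actual trust algorithm:

```python
# Minimal sketch of a stake-weighted reputation score for Vaders
# (value-add providers). Illustrative only.

def reputation(ratings: list[tuple[float, float]]) -> float:
    """Weighted average of (rating 0-5, transaction_value) pairs:
    feedback from larger engagements moves the score more."""
    if not ratings:
        return 0.0
    total_value = sum(value for _, value in ratings)
    return sum(rating * value for rating, value in ratings) / total_value

# A 5-star rating on a large job outweighs a 3-star rating on a small one.
score = reputation([(5.0, 100.0), (3.0, 50.0)])
```

Users of the market could then compare scores when deciding which Vaders to include in delivering a service.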