Adaptive Capital
By Josh Fairhead • 12 minute read
Tokenomics Proposal: “Adaptive Capital”
Background
We are proposing an ecosystem-level partnership between Regen Network and Protocol Labs through the mutual funding of a Tokenomics DAO. This comes from our team's desire to bridge the gap between research and implementation.
The proposal below outlines the current state of our implementation plan, and we are soliciting discussion and feedback for further advancement. There is plenty of scope for improvement, which we trust will happen as we advance our work. We believe that the proposal submitted to PL can help lay strong research foundations, while Regen is fertile soil for technical implementation that contributes to grounded impact in real communities.
Specifications
Regen Registry defines Nature Based Solutions in the following manner:
“Actions to protect, regeneratively manage and restore natural or modified ecosystems that address societal challenges effectively and adaptively, simultaneously providing human well-being and biodiversity benefits”.
“Adaptive Capital” is thus a proposition for valuing living ecosystems through multi-perspective architecture and the free energy principle; a means of evaluating quantitative measures against qualitative outcomes. For such a project to find traction, we need to consider the existential domains of markets, science and technology, while holding in mind the essential values of people, planet and protocols. Let’s explore our assumptions about these domains:
Market Assumptions
Markets are seeking pathways for divestment from degenerate commodities into more regenerative investment vehicles. However, divestment is constrained by a lack of such pathways, and it is estimated that there are two trillion dollars of stranded assets trading on Wall St. books. In the author's opinion this number represents a transformation opportunity that is global in scale, provided we can address the market's pain points with integrity. Please see the RFP-X Research doc for a more detailed contextual framing of this subject.
Scientific Assumptions
When it comes to addressing markets with integrity, we need to incorporate truth claims and anchor this context. The methods used and the claims made are important; Popper believed science was about the raw explanatory power and predictive capacity of a framework, while Kuhn thought of it as a community of practice.
In our view both authors are correct; reason can be thought of as rational proof (deductive knowledge utilising the Baconian methodology) and as empirical verification (inductive wisdom through the Goethean methodology). Consequently, the ReSci movement might consider integrating both these fundamental approaches to making truth claims by leveraging abductive reasoning and Bayesian methods, which base predictions on known priors before feeding forward to infer likely outcomes.
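To make the Bayesian idea concrete, here is a minimal sketch of feeding a known prior forward as evidence arrives. The function, the prior of 0.3, and the likelihood figures are all illustrative assumptions, not values from the proposal:

```python
# Hypothetical sketch: updating belief in a truth claim ("this plot is
# regenerating") as new positive observations arrive, using Bayes' rule.
# The prior and likelihoods below are invented for illustration.

def bayes_update(prior: float, likelihood: float, false_positive: float) -> float:
    """Posterior probability of the claim given one positive observation."""
    evidence = likelihood * prior + false_positive * (1.0 - prior)
    return (likelihood * prior) / evidence

# Start from a 30% prior and feed forward three positive sensor readings.
belief = 0.3
for _ in range(3):
    belief = bayes_update(belief, likelihood=0.8, false_positive=0.2)

print(round(belief, 3))  # belief strengthens with each consistent reading
```

Each observation moves the posterior, which then becomes the prior for the next step — the "feeding forward" described above.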
Technological Assumptions
From a practical standpoint, the story of digital communication begins with Claude Shannon's information theory: binary 1s and 0s that convey state. Beyond Shannon's fundamental theory of bits and bytes, there are new innovations that abstract his binary domain into a trinary lens of “agents, languages and perspectives”, where agents, speaking their language of preference, express perspectives on truth claims.
As a technical architecture, such a ‘meta-ontology’ enables us to encode other frameworks as ‘perspectives’. In particular, we believe that “general systematics” is a powerful epistemological framework that can act as a functional backbone for articulating parameter groupings in a transparent way.
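A minimal sketch of the agent/language/perspective triad may help fix the idea. These dataclasses illustrate the structure only; the names and fields are hypothetical and do not reflect the actual AD4M API:

```python
# Hypothetical sketch of the "agents, languages, perspectives" triad.
# Structure is illustrative only; this is not the AD4M API.
from dataclasses import dataclass, field

@dataclass
class Language:
    """A shared medium an agent speaks, e.g. a storage or data format."""
    name: str

@dataclass
class Perspective:
    """An agent's set of expressed claims about the world."""
    claims: list = field(default_factory=list)

    def express(self, claim: str) -> None:
        self.claims.append(claim)

@dataclass
class Agent:
    did: str                    # decentralised identifier
    language: Language          # the agent's language of preference
    perspective: Perspective = field(default_factory=Perspective)

steward = Agent(did="did:example:steward", language=Language("regen-data-module"))
steward.perspective.express("soil carbon increased on plot 7")
print(steward.perspective.claims)
```

The point of the triad is that any framework — general systematics included — can be carried as one more perspective over the same substrate.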
Applied at the interface level, these functional frameworks can be used to generate a consistent set of active inference models that enable the composable encoding of practice-based methodologies, eco-credits, accounting methodologies, evaluation criteria, sensor network APIs, DAOs and far beyond. Such architecture is particularly elegant as it enables semantic translation and ad hoc composability for an apples-to-oranges comparison of quantitative measures with qualitative values.
Infrastructure Design
Let’s frame the high-level design abstractions we will lean on in relation to our ends:
(+) Qualitative values
(=) Systematised Markov blankets
(-) Quantitive measures
Markov blankets act as a form of general substrate that can then be further systematised with metadata to enable semantic interoperability — a highly general medium that is analogous to merkle trees and hash graphs. In Regen's case this technical infrastructure is represented by the data module.
An impact certificate essentially articulates a given methodology in a machine-readable format. These certificates represent a set of means, while evaluation criteria codified in a similar format can express values as ends. For example, when it comes to open science, there is a growing movement towards making papers more accessible by encoding them as ‘conversation graphs’; these might be represented in a tetradic geometry with nodes representing:
Directive: Source
Instrumental: Question
Ground: Claim
Ideal: Proof
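The four node roles above could be sketched as a small conversation-graph structure. The class names, example content and linking scheme are all hypothetical, intended only to show how a paper might decompose into tetradic nodes:

```python
# Hypothetical sketch of a tetradic conversation graph: each element of a
# paper becomes a node with one of four roles, linked into a graph of claims.
from dataclasses import dataclass, field

@dataclass
class Node:
    role: str          # one of: Directive, Instrumental, Ground, Ideal
    content: str
    links: list = field(default_factory=list)  # edges to related nodes

source   = Node("Directive",    "Example survey paper (invented reference)")
question = Node("Instrumental", "Does cover cropping raise soil carbon?")
claim    = Node("Ground",       "Cover cropping raised soil carbon on test plots")
proof    = Node("Ideal",        "Plot-level measurement appendix")

# Wire the tetrad: the claim is anchored by its source, question and proof.
claim.links = [source, question, proof]
print([n.role for n in claim.links])
```

Traversing such a graph lets a reader move from any claim to the question it answers and the proof that supports it, rather than scanning linear text.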
Extrapolating this concept one step further, it is easy enough to imagine a form of bespoke facilitation where the content of any conversation can be functionally represented as geometries to support processes of communal understanding.
When a discussion reaches consensus on credit class, methodology, DAO design, evaluation criteria, etc., we can use the interface to easily store and anchor data on chain. Any subsequent dialogue can be self-managed by the community, who can update their credit design parameters via their governance process and a simple interface. Supporting conversation and data can be aggregated as a form of composable capital to be issued as eco-credits, which would democratise access to this technology in a significant way. Such a process can even be turned into art or gamified to make it fun and rewarding!
Below the interface level come the functional specifications for on-chain integration. Using the AD4M ontology we can integrate general systematics to describe a set of metadata standards. From there we can describe both the data module and the eco-credit module as ‘languages’ and ‘perspectives’ in order to make them interoperable with other system architectures. For added precision, we may wish to add other ontological ‘perspectives’ for translating quantity (data module claims like MRV) to quality (eco-credit attestations like REA).
In the case of the latter, Resource-Event-Agent (REA) accounting is a particularly useful ontology to integrate, as it is an institutionally recognised accounting standard for distributed supply chains. This might then use feeds from Monitoring, Reporting and Verification (MRV) technologies as evaluation criteria for impact certificates. The elegance of this design is that if these standards are changed or extended, they can be easily recomposed or updated.
For example, we can extend the triadic MRV geometry into a tetradic geometry for more detailed resolution. The following was articulated by dMeter community members:
Directive: Human-sensing (Personal data stores, phone images, self reported data)
Instrumental: Remote-sensing (IOT sensors, devices)
Ideal: Sky-sensing (satellite, drones, aerial)
Ground: Reputation-sensing (identity, community, trust)
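One simple way the four sensing channels above could feed an evaluation criterion is as a weighted composite score. The weights and per-channel scores below are invented for illustration; real weightings would be a credit-design parameter set by the community:

```python
# Hypothetical sketch: folding the four sensing channels into a single
# evaluation score for an impact certificate. Weights are illustrative only
# and would in practice be a governed credit-design parameter.
SENSING_WEIGHTS = {
    "human":      0.2,   # self-reported data, phone images
    "remote":     0.3,   # IoT sensors on site
    "sky":        0.3,   # satellite, drone, aerial imagery
    "reputation": 0.2,   # identity, community and trust signals
}

def evaluation_score(readings: dict) -> float:
    """Weighted average of per-channel scores, each in [0, 1]."""
    return sum(SENSING_WEIGHTS[ch] * readings[ch] for ch in SENSING_WEIGHTS)

score = evaluation_score({"human": 0.9, "remote": 0.7, "sky": 0.8, "reputation": 1.0})
print(round(score, 2))
```

Because each channel is independent, a community could recompose the weights — or add a fifth channel — without touching the rest of the pipeline, which is the recomposability argued for above.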
Ultimately, whatever indicators are desirable can be articulated through general systematics and codified as Markov blankets to unlock the capabilities of semantically composable impact markets. These functional models can then be enriched with active inference to provide greater depth of resolution.
Pilot Tokenomic Mechanism(s): Adaptive Capital
The architecture above is designed at the level of general infrastructure and can be used to compose any desirable credit design, dependent on the articulation of parametric groupings instantiated around a use case or pilot project. These parameters and the rules governing them matter significantly when scaled to the level of a community, commons or bioregion.
There are already a number of great projects in the Regen ecosystem that would stand to benefit from such composable designs, and we're keen to partner based on DAO capacity. LunarPunk Labs, as proposal authors, have a strong preference for working closely with our existing partners at the charity Regenerating Sonora in order to build out a low-risk pilot project. Conditions around them are exceptionally vital, as they have LEHR gardens, a community centre with a blockchain learning desk and a high school tech club working with Arduino sensors!
Basic Credits: Regenerating Sonora use case
Starting from the LEHR gardens, possibilities are assessed by a student group.
It’s decided that the key step forward is to automate the garden's water timers.
A group of local high school students automate the timers using their Arduino kits.
Using the systematics interface, Arduino data feeds are mapped as a ‘perspective’.
These are then allocated to a ‘language’ describing the storage mediums (e.g. Filecoin Green).
As the students add other sensors to augment the garden timers new feedback loops arise.
These changes to the model are assessed using active inference: do the feedback loops suggest the gardens are growing better or worse with the new automation?
Documented processes reporting improvements drive up the credit value. Assuming the local school holds a portion of tokens, they stand to benefit from any speculative activity on their work.
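The assessment step in the pilot above could start as something very simple. This is a toy stand-in for the active-inference check, with invented readings and an invented moisture target, comparing prediction error before and after the Arduino automation:

```python
# Toy stand-in for the pilot's active-inference check: compare soil moisture
# deviation from a target before and after the Arduino automation. Lower
# "surprise" (prediction error) suggests the feedback loop is working.
# All readings and the target value are invented for illustration.
from statistics import mean

TARGET = 0.35  # hypothetical target volumetric soil moisture

def surprise(readings: list) -> float:
    """Mean squared deviation from the target: a crude prediction error."""
    return mean((r - TARGET) ** 2 for r in readings)

before = [0.15, 0.50, 0.20, 0.55, 0.10]   # manual watering (erratic)
after  = [0.33, 0.36, 0.34, 0.37, 0.35]   # automated timers (stable)

improved = surprise(after) < surprise(before)
print(improved)
```

A real deployment would replace the scalar target with a generative model of healthy garden dynamics, but the shape of the question — did automation reduce surprise? — stays the same.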
Community Capital: H3 University use case
An example methodology our students might use to additionally codify the co-benefits produced by the community's vibrant social capital is the H3uni framework deployed in the current course “The Art of Creative Collaboration”. This particular practice-based methodology groups eight sets of parameters under four framing conditions.
Framing conditions:
Directive: Process
Instrumental: Method
Ideal: Task
Ground: Context
Parameter Groupings:
Self awareness
Empathetic appreciation
Group dynamics
Interactive skill
Cognitive framing
Visualisation skill
Perceiving qualities
Mental repatterning
This particular methodology is currently being demonstrated both as the H3uni course, with video documentary evidence, and in discussion amongst our teams on WeCo.io as a ‘weave’ (chains of 1-minute audio recordings / 140-character cards in a sequence as a conversation graph). The data from this form of ‘prosocial computation’ can all be anchored on a storage medium of choice using the Regen Data Module. You can keep up with the team's pretotype testing here.
Teams as Viable Systems DAO
Purpose :: Pilot Implementation :: Regenerating Sonora
Intelligence :: Logistics & Coordination :: LunarPunk_Labs
Operations :: Development :: WeCo
Coordination Task :: Design :: VIZN_Labs
Resourcing :: Research :: Sovereign Nature Initiative
Pathways
(+) Applied Science: Place based communities
Purpose: Field learning and environmental stewardship
Intelligence: Practice based methodologies
Operations: Onboarding pilot projects and facilitation
Coordination Task: Methodology Development
Resourcing: Lab learning as rational understanding
(-) Social Technology: Systematic verification Infrastructure
Purpose: Impact evaluators as quality assurance from REA value accounting
Intelligence: Academic research and conversation graphs as attestations
Operations: Hypercerts making use of general systematics and active inference
Coordination Task: Interfaces and conversation graphs as a source of claims
Resourcing: Impact certificates as sources of quantitative measurements from MRV
(=) Impact Markets: Operationalised DAO
Purpose: Advocacy and narrative as transformative change
Intelligence: Legal and regulatory movements
Operations: Events, logistics and investments
Coordination Task: Marketing and communications
Resourcing: Accounts, fundraising and economics
Project exposure
Refi Spring
GR15 socialisation
Gathering of the Tribes
Refi DAO founders circles
Labs Week 2022 socialisation
Future Quest grant top 100 grantee
Gitcoin Verification Infrastructure bundle
Roadmap and Budget
Meta
Lead: LunarPunk_Labs or other brand as an implementation DAO
Serve: H3uni.org as an educational consortium of methodology developers
Support: Protocol Labs and Regen Network in the development of semantic bluefunds through applied research on evolving practical language models through second order methodologies, interfaces and semantic protocols.
Milestone one: Functional Specifications
Start Date: 2023-04-06
Deliverables:
Submitting a Research proposal to Protocol Labs in parallel to this RFP
Presentation on state of affairs shared with community partners and date set for sense making
Public sense making session in ‘consortium’ format
Working with Implementation partners to align on functional specifications
Public presentation of engineering documentation to Regen Community and Network Goods team
Submission made to the network via an on-chain vote with a two-week deliberation period
Proposal passes or fails based on validator determinations
Budget:
$35k Research budget from Protocol Labs to LunarPunk_Labs
Token grant from Regen Foundation to community partners
Milestone Two: Tokenomics DAO Design
Start Date: 2023-07-17
Deliverables:
Convene discussions around DAO design and Tokenomic Mechanism and share in a group context
Practice the design and document the results with digital technology
Presentation of DAO pilot programme provided to community partners
Working with place based community facilitators to align on practices and methodologies
Onboarding documentation compiled and shared online
Community completes discussion of DAO design as engineering on a pilot project begins
Updates submitted to Commonwealth, Protocol Labs and Regen community
Budget:
$25k Cash grant from RND inc. to LunarPunk_Labs
Token delegation from Regen Foundation to community partners
Milestone Three: Pilot Project(s)
Articulate the precise eco-credit methodology and share in a group context
Connect with the Regen Registry team while applying the methodology technique, documenting the results with digital technology
Presentation of pretotype results provided to pilot partners
Working with place based community facilitators to align with and codify their practice
Documentation published on their precise method
Methodology submitted for peer review
Expert review via the Regen Registry team or third party verifier
Budget:
200k REGEN tokens with 1 year lockup from RND inc. to implementation partners
Milestone Four: Tokenomics csDAO
Budget:
500k locked REGEN tokens to initiate the csDAO structure.
Contributions from Protocol Labs to sustain traction
Milestone Five: Engineering
Interface design for functionally encoding community conversations
Describing general systematics as an AD4M perspective
Integrating Regen’s data module as an AD4M language
Integrating Regen’s eco-credit module as an AD4M perspective
Budget: TBD
Milestone Six: Perfected Designs
Anchoring partner data on chain with the Regen data module
Minting eco-credits for pilot project partners
Budget: TBD
Milestone Seven: Quality Control
Issuing general systematics as a semantic credit class upon Registry Governance sign off.
Budget: TBD