Internet-Draft                  PoP Appraisal                  February 2026
Condrey                    Expires 18 August 2026
This document specifies the forensic appraisal methodology and quantitative security model for the Proof of Process (PoP) framework. It defines how Verifiers evaluate behavioral entropy, perform liveness detection, and calculate forgery cost bounds. Additionally, it establishes the taxonomy for Absence Proofs and the Tool Receipt protocol for AI attribution within the linear human authoring process.¶
This Internet-Draft is submitted in full conformance with the provisions of BCP 78 and BCP 79.¶
Internet-Drafts are working documents of the Internet Engineering Task Force (IETF). Note that other groups may also distribute working documents as Internet-Drafts. The list of current Internet-Drafts is at https://datatracker.ietf.org/drafts/current/.¶
Internet-Drafts are draft documents valid for a maximum of six months and may be updated, replaced, or obsoleted by other documents at any time. It is inappropriate to use Internet-Drafts as reference material or to cite them other than as "work in progress."¶
This Internet-Draft will expire on 18 August 2026.¶
Copyright (c) 2026 IETF Trust and the persons identified as the document authors. All rights reserved.¶
This document is subject to BCP 78 and the IETF Trust's Legal Provisions Relating to IETF Documents (https://trustee.ietf.org/license-info) in effect on the date of publication of this document. Please review these documents carefully, as they describe your rights and restrictions with respect to this document. Code Components extracted from this document must include Revised BSD License text as described in Section 4.e of the Trust Legal Provisions and are provided without warranty as described in the Revised BSD License.¶
The value of Proof of Process (PoP) evidence lies in the Verifier's ability to distinguish biological effort from algorithmic simulation. While traditional RATS [RFC9334] appraisals verify system state, PoP appraisal verifies a continuous physical process. This document provides the normative framework for forensic appraisal, defining the logic required to generate a Writers Authenticity Report (WAR).¶
The key words "MUST", "MUST NOT", "REQUIRED", "SHALL", "SHALL NOT", "SHOULD", "SHOULD NOT", "RECOMMENDED", "NOT RECOMMENDED", "MAY", and "OPTIONAL" in this document are to be interpreted as described in BCP 14 [RFC2119] [RFC8174] when, and only when, they appear in all capitals, as shown here.¶
A Verifier MUST perform the following procedure to appraise a PoP Evidence Packet:¶
The appraisal logic is designed to detect "Synthetic Authoring": content generated by AI and subsequently "back-filled" with timing and hardware attestation.¶
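One illustrative (non-normative) back-fill check: synthetic timing traces tend to be statistically "too smooth" relative to genuine human inter-keystroke jitter. The function names and the 0.3 threshold below are hypothetical, not part of this specification; a conforming Verifier would combine such a check with the other dimensions defined in this document.

```python
# Illustrative sketch only: detect suspiciously uniform inter-keystroke
# timing, one possible signal of back-filled (synthetic) authoring.
# Names and thresholds are hypothetical, not normative.
import statistics

def jitter_score(inter_key_ms: list) -> float:
    """Coefficient of variation of inter-keystroke intervals."""
    mean = statistics.mean(inter_key_ms)
    return statistics.stdev(inter_key_ms) / mean if mean > 0 else 0.0

def flag_synthetic(inter_key_ms: list, min_cv: float = 0.3) -> bool:
    """Flag a session whose timing variability falls below a floor
    typical of biological typing (threshold is illustrative)."""
    return jitter_score(inter_key_ms) < min_cv
```

A perfectly regular trace (constant intervals) scores a coefficient of variation of zero and is flagged, while naturally bursty human intervals pass.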
Forgery cost bounds provide a Verifier with a lower bound on the computational resources required to forge an Evidence Packet. The cost (C_total) is computed as:¶
C_total = C_vdf + C_entropy + C_hardware¶
Verifiers MUST include these estimates in the WAR to allow Relying Parties to set trust thresholds based on objective economic risk.¶
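The additive cost model above can be sketched as a simple structure a Verifier might emit into the WAR. The field units (e.g., CPU-hours or a monetary estimate) and the class name are illustrative assumptions, not defined by this document.

```python
# Sketch of the forgery cost bound C_total = C_vdf + C_entropy + C_hardware.
# Units and field names are illustrative; only the additive form comes
# from the specification text.
from dataclasses import dataclass

@dataclass
class ForgeryCostBound:
    c_vdf: float       # sequential VDF evaluation cost
    c_entropy: float   # cost to synthesize plausible behavioral entropy
    c_hardware: float  # cost to defeat the attested hardware tier

    @property
    def c_total(self) -> float:
        return self.c_vdf + self.c_entropy + self.c_hardware
```

A Relying Party would compare `c_total` against its own economic risk threshold rather than treating it as a cryptographic guarantee.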
Absence proofs assert that certain events did NOT occur during the monitored session. They are divided into categories based on verifiability:¶
When external tools (LLMs) contribute content, the framework enables a "compositional provenance" model:¶
Verifiers appraise the ratio of human-to-machine effort based on these receipts and the intervening VDF-proved intervals.¶
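A minimal sketch of that ratio, assuming the Verifier has already partitioned session time into VDF-proved human intervals and tool-receipted machine intervals (the partitioning itself is outside this sketch, and the function name is hypothetical):

```python
# Illustrative only: fraction of session effort attributed to VDF-proved
# human intervals versus Tool-Receipt-covered machine intervals.
def human_effort_ratio(vdf_intervals_ms, tool_receipt_intervals_ms):
    human = sum(vdf_intervals_ms)
    machine = sum(tool_receipt_intervals_ms)
    total = human + machine
    return human / total if total else 0.0
```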
The security model assumes a "Rational Forger" whose goal is to minimize compute cost while maximizing forensic confidence.¶
High-resolution behavioral data poses a stylometric de-anonymization risk [Goodman2007]. Implementations SHOULD support Evidence Quantization, reducing timing resolution to a level that maintains forensic confidence (detecting robots) while breaking unique author fingerprints.¶
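Evidence Quantization can be as simple as snapping event timestamps to a coarse grid before they leave the attester. The 50 ms bucket below is an illustrative assumption; the appropriate resolution is a deployment choice balancing forensic confidence against fingerprint risk.

```python
# Sketch of Evidence Quantization: round timestamps to a coarse grid so
# inter-event jitter still distinguishes humans from automation, but
# fine-grained rhythm fingerprints blur. Bucket size is illustrative.
def quantize(timestamps_ms, bucket_ms=50):
    return [round(t / bucket_ms) * bucket_ms for t in timestamps_ms]
```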
Verifiers MUST NOT automatically reject evidence based solely on atypical timing patterns. Implementations MUST support "Assistive Modes" that adjust SNR and CLC thresholds for authors with motor disabilities or those using assistive technologies (eye-tracking, dictation).¶
This document has no IANA actions. All IANA registrations for the PoP framework are defined in [PoP-Protocol].¶
This document defines forensic appraisal procedures that inherit and extend the security model from [PoP-Protocol]. The broader RATS security considerations [Sardar-RATS] also apply. Implementers MUST consider the following security aspects:¶
An adversary may attempt to inject synthetic jitter patterns that satisfy entropy thresholds while lacking biological origin. Verifiers MUST employ multi-dimensional analysis (SNR, CLC, Error Topology) rather than relying on single metrics. The correlation between semantic content complexity and timing variation provides defense-in-depth against high-fidelity simulation.¶
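One hedged way to realize "multi-dimensional rather than single-metric" scoring is a geometric mean over normalized per-dimension scores: any single near-zero dimension collapses the combined score, so an adversary cannot pass by spoofing one metric. The function and the [0, 1] normalization are assumptions for illustration, not a normative scoring rule.

```python
# Illustrative combination of normalized scores in [0, 1] for SNR,
# CLC, and Error Topology. The geometric mean forces a forger to be
# plausible on every dimension simultaneously.
def combined_confidence(snr: float, clc: float, topo: float) -> float:
    return (snr * clc * topo) ** (1.0 / 3.0)
```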
The forensic assessments defined in this document produce probabilistic confidence scores, not binary determinations. Relying Parties MUST understand that forgery cost bounds represent economic estimates, not cryptographic guarantees. Trust decisions SHOULD incorporate the declared Attestation Tier (T1-T4) and the specific absence proof types claimed.¶
High-resolution behavioral data (keystroke timing, pause patterns) can enable author identification even when document content is not disclosed. Implementations SHOULD support Evidence Quantization to reduce timing resolution while maintaining forensic utility. The trade-off between forensic confidence and privacy MUST be documented for Relying Parties.¶
Adversaries may falsely claim assistive technology usage to bypass behavioral entropy checks. Verifiers SHOULD require consistent assistive mode declarations across sessions and MAY request additional out-of-band verification for mode changes. The WAR MUST clearly indicate when assistive modes were active.¶
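The cross-session consistency check can be sketched as a comparison of declared assistive-mode sets; a Verifier would escalate to out-of-band verification on any change. The representation of a mode declaration as a set of strings is an assumption of this sketch.

```python
# Illustrative consistency check on assistive-mode declarations across
# sessions; any change would trigger additional verification. The
# frozenset-of-strings representation is a hypothetical encoding.
def consistent_assistive_modes(session_modes) -> bool:
    return len(set(session_modes)) <= 1
```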
The following constraints MUST be verified by conforming Verifiers:¶