Internet-Draft: Signature-Agent and Max-Crawl-Rate for Robot Exclusion Protocol
Author: Meunier
Date: June 2025
Expires: 21 December 2025
This document describes new rules that allow the Signature-Agent directive (Section 4 of [SIGNATURE-DIRECTORY]) to be used in the Robot Exclusion Protocol ([RFC9309]).
This note is to be removed before publishing as an RFC.

The latest revision of this draft can be found at https://thibmeu.github.io/http-message-signatures-directory/draft-meunier-signature-agent-rep.html. Status information for this document may be found at https://datatracker.ietf.org/doc/draft-meunier-signature-agent-rep/.

Discussion of this document takes place on the HTTP mailing list (mailto:ietf-http-wg@w3.org), which is archived at https://lists.w3.org/Archives/Public/ietf-http-wg/.

Source for this draft and an issue tracker can be found at https://github.com/thibmeu/http-message-signatures-directory.

This Internet-Draft is submitted in full conformance with the provisions of BCP 78 and BCP 79.

Internet-Drafts are working documents of the Internet Engineering Task Force (IETF). Note that other groups may also distribute working documents as Internet-Drafts. The list of current Internet-Drafts is at https://datatracker.ietf.org/drafts/current/.

Internet-Drafts are draft documents valid for a maximum of six months and may be updated, replaced, or obsoleted by other documents at any time. It is inappropriate to use Internet-Drafts as reference material or to cite them other than as "work in progress."

This Internet-Draft will expire on 21 December 2025.

Copyright (c) 2025 IETF Trust and the persons identified as the document authors. All rights reserved.

This document is subject to BCP 78 and the IETF Trust's Legal Provisions Relating to IETF Documents (https://trustee.ietf.org/license-info) in effect on the date of publication of this document. Please review these documents carefully, as they describe your rights and restrictions with respect to this document. Code Components extracted from this document must include Revised BSD License text as described in Section 4.e of the Trust Legal Provisions and are provided without warranty as described in the Revised BSD License.
Bots are increasingly using Signature-Agent as a way to convey their identity. As such, there is interest from origins in defining robot policy based on this header. In addition, it would be useful if a suggested rate limit could be communicated alongside it.

This document extends the Robot Exclusion Protocol to support these use cases by extending the user-agent group with new rules.
The key words "MUST", "MUST NOT", "REQUIRED", "SHALL", "SHALL NOT", "SHOULD", "SHOULD NOT", "RECOMMENDED", "NOT RECOMMENDED", "MAY", and "OPTIONAL" in this document are to be interpreted as described in BCP 14 [RFC2119] [RFC8174] when, and only when, they appear in all capitals, as shown here.
A group consists of one or more user-agent lines, followed by one or more signature-agent lines, max-crawl-rate lines, and rules. The group is terminated by a user-agent line or by the end of the file. The last group may have no rules, which means it implicitly allows everything.
Based on the formal syntax defined in Section 2.2 of [RFC9309]:

   group = startgroupline *(startgroupline / emptyline)
                          ; We start with a user-agent line
                          ; and possibly more
           *(signatureagentline / emptyline)
                          ; Specification for signature-agent
           *(maxcrawlrateline / emptyline)
                          ; Specification for max-crawl-rate
           *(rule / emptyline)
                          ; followed by rules relevant for
                          ; the preceding lines

   modifierline = signatureagentline / maxcrawlrateline
                          ; a group can carry either modifier,
                          ; or both

   signatureagentline = *WS "signature-agent" *WS ":" *WS
                        directory-token EOL

   maxcrawlrateline = *WS "max-crawl-rate" *WS ":" *WS
                      1*DIGIT *WS ["/" *WS timeunit] EOL

   directory-token = DQUOTE "https://" fqdn DQUOTE

   timeunit = "s" / "m" / "h" / "d" / "w"

   fqdn = ... ; domain as defined by signature-agent. TBD

   DQUOTE = %x22 ; "
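As an informal illustration of the max-crawl-rate line syntax, the following sketch parses such a line with a regular expression. The function name, the case-insensitive matching, and the fallback to seconds when no time unit is given are this example's assumptions, not requirements of this document.

```python
import re

# Mirrors the maxcrawlrateline grammar: optional whitespace, the directive
# name, a colon, digits, and an optional "/" followed by a time unit.
MAX_CRAWL_RATE_RE = re.compile(
    r"^\s*max-crawl-rate\s*:\s*(\d+)\s*(?:/\s*([smhdw]))?\s*$",
    re.IGNORECASE,
)

def parse_max_crawl_rate(line):
    """Return (requests, timeunit) for a max-crawl-rate line, or None."""
    m = MAX_CRAWL_RATE_RE.match(line)
    if m is None:
        return None
    count = int(m.group(1))
    # Assumption of this sketch: a missing unit defaults to seconds.
    unit = (m.group(2) or "s").lower()
    return count, unit

print(parse_max_crawl_rate("max-crawl-rate: 100/m"))  # (100, 'm')
print(parse_max_crawl_rate("Max-Crawl-Rate: 10"))     # (10, 's')
```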
Crawlers set their own identity, which is called a directory token, to find relevant groups. The directory token MUST contain only lowercase letters ("a-z"), underscores ("_"), hyphens ("-"), and dots ("."). The directory token SHOULD be a valid FQDN suffix of the identification string that the crawler sends to the service. For example, in the case of HTTP [RFC9110], the directory token SHOULD be a suffix of the Signature-Agent header value. The identification string SHOULD describe the public cryptographic key material of the crawler. The table below shows an example for the crawler.example.com crawler: its identity appears in the Signature-Agent HTTP request header, and the directory token on the robots.txt signature-agent line is a suffix of it.
   +======================================+==============================+
   | Signature-Agent HTTP header          | robots.txt signature-agent   |
   |                                      | line                         |
   +======================================+==============================+
   | Signature-Agent: crawler.example.com | signature-agent: example.com |
   +--------------------------------------+------------------------------+
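The suffix relationship above can be sketched as follows. The function name and the normalization steps are this example's assumptions; the label-boundary check reflects the requirement that the directory token be a valid FQDN suffix of the crawler's identification string.

```python
def matches_directory_token(signature_agent, directory_token):
    """Check whether a robots.txt directory token is an FQDN suffix of
    the identity sent in the Signature-Agent HTTP header (sketch)."""
    agent = signature_agent.strip('"').lower()
    token = directory_token.strip('"').lower()
    # The grammar quotes the token as an https:// URL; strip the scheme
    # so only the domain part is compared.
    prefix = "https://"
    if agent.startswith(prefix):
        agent = agent[len(prefix):]
    if token.startswith(prefix):
        token = token[len(prefix):]
    # Exact match, or a suffix match on a label boundary:
    # crawler.example.com matches example.com, notexample.com does not.
    return agent == token or agent.endswith("." + token)

print(matches_directory_token("crawler.example.com", "example.com"))  # True
print(matches_directory_token("notexample.com", "example.com"))       # False
```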
The max-crawl-rate directive specifies the maximum number of requests per time unit that a robot SHOULD make to the origin server, for the group it applies to. Well-behaved agents are expected to comply by limiting their request rate accordingly. This directive does not enforce technical access restrictions, and adherence is voluntary. Servers MAY monitor agents' behavior and take measures if necessary to protect their resources.
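A compliant crawler might translate a parsed rate into a minimum delay between requests. This is a sketch under this example's assumptions (the unit-to-seconds mapping and function name are not defined by this document):

```python
# Seconds per time unit for the grammar's "s"/"m"/"h"/"d"/"w" units.
UNIT_SECONDS = {"s": 1, "m": 60, "h": 3600, "d": 86400, "w": 604800}

def min_request_interval(count, unit):
    """Minimum delay in seconds between requests to honour a rate of
    `count` requests per `unit`; e.g. 100/m means one request every 0.6 s."""
    return UNIT_SECONDS[unit] / count

print(min_request_interval(100, "m"))  # 0.6
```

A crawler would then sleep at least this long between successive requests to the origin.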
The Signature-Agent group shares the security considerations of [RFC9309]. In addition, given that Signature-Agent MAY present a domain name identifying a crawler's public cryptographic key material, implementers should treat the content of the signature-agent line as possibly sensitive.
This document has no IANA actions.
   Signature-Agent: example.com
   Allow: *
Not in the draft yet. We don't want to incentivise not rotating public keys.
   Signature-Agent: poqkLGiymh_W0uP6PZFw-dvez3QJT5SolqXBCW38r0U
   Disallow: /path/to/resource
TODO acknowledge.