
Humans.txt & hostmeta

written by John Drinkwater, on

In a similar vein to robots.txt, some aspiring people have decided the best way to codify the authorship of sites is to place a /humans.txt on the server, structured as just plain text…

But it’s so under-specced it makes me cringe: no default encoding, no defined format, no way to register future fields, and a fixed file name sitting in the root.

We like to do things well, that is why we want to provide guidelines for a standard humans.txt, Abel Cabans defined the basic fields that you can consult here.

You are also free to add the one you want to.

From http://humanstxt.org/
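
For reference, their suggested layout amounts to a few comment-style section headers with free-form key: value lines underneath, roughly like the following (the values here are placeholders of my own, not anything taken from the site):

    /* TEAM */
        Developer: John Drinkwater
        Location: UK

    /* SITE */
        Last update: YYYY/MM/DD
        Standards: HTML5, CSS3
        Software: vim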

Which is all well and good, until you consider there is no structure to the format, at all. I reckon I should put all my entries in humans.txt inside cowsay blocks.

I sent them a message, hopefully they’ll get back to me:

@johndrinkwater @humanstxt did you look at YAML for the format? or consider a default encoding? (UTF-8) or look into RFC5785 for file placement?

From http://identi.ca/conversation/62138855
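
To make that concrete, here’s roughly what I have in mind, purely a sketch of my own and not anything humanstxt.org or any draft defines: the same details expressed as YAML, declared as UTF-8, and served from the RFC 5785 well-known path.

    # Hypothetical /.well-known/humans.txt, YAML-structured, UTF-8
    # (the field names are my own mapping of the humanstxt.org fields)
    team:
      - role: Developer
        name: John Drinkwater
        location: UK
    site:
      standards: [HTML5, CSS3]
      software: [vim]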

Props to Tim Bray for pointing me to the project.

@timbray "We are humans not machines": http://humanstxt.org/ But should be /.well-known/humans.txt, see draft-hammer-hostmeta

From http://twitter.com/timbray/status/29236420900028416
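
The point of the well-known path is that a consumer gets one predictable place to look. A rough sketch of what that might look like in Python, with the fallback to the current root location and the UTF-8 assumption both being mine:

    # Rough sketch: try the RFC 5785 well-known path first,
    # then fall back to the root location humanstxt.org currently uses.
    from urllib.request import urlopen
    from urllib.error import HTTPError, URLError

    def fetch_humans(host):
        for path in ("/.well-known/humans.txt", "/humans.txt"):
            try:
                with urlopen("http://" + host + path) as response:
                    # No encoding is specified anywhere, so assume UTF-8.
                    return response.read().decode("utf-8")
            except (HTTPError, URLError):
                continue
        return None

    print(fetch_humans("humanstxt.org"))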