• snooggums@midwest.social
    6 days ago

    How would a site make itself accessible to the internet in general while also preventing itself from being scraped through technical means?

    robots.txt does rely on being respected, just like no-trespassing signs. The lack of enforcement is the problem, and keeping robots.txt as the record of permissions would make it effective again once it is actually enforced.
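
    As a sketch of what "tracking the permissions" could look like, a robots.txt can already record per-crawler rules explicitly (the specific policy below is just an illustration, not a recommendation):

    ```
    # Block a known AI scraper entirely
    User-agent: GPTBot
    Disallow: /

    # Allow a search crawler full access
    User-agent: Googlebot
    Allow: /

    # Everyone else: keep out of the private area
    User-agent: *
    Disallow: /private/
    ```

    The file states intent clearly, which is exactly why it works like a no-trespassing sign: the permissions are unambiguous, only the consequences for ignoring them are missing.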

    I am agreeing, just with a slightly different take.