robotstxt - A 'robots.txt' Parser and 'Webbot'/'Spider'/'Crawler' Permissions Checker
Provides functions to download and parse 'robots.txt' files. Ultimately, the package makes it easy to check whether bots (spiders, crawlers, scrapers, ...) are allowed to access specific resources on a domain.
Last updated 8 days ago
crawler, peer-reviewed, robotstxt, scraper, spider, webscraping
10.45 score 68 stars 6 packages 358 scripts 3.2k downloads
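A minimal sketch of how a permissions check with robotstxt might look, using the paths_allowed() and get_robotstxt() functions from the package; the domain and paths are illustrative only.

```r
library(robotstxt)

# Download the robots.txt file for a domain
rt <- get_robotstxt(domain = "wikipedia.org")

# Check whether a generic bot ("*") may fetch specific paths;
# returns a logical vector with one element per path
paths_allowed(
  paths  = c("/wiki/R_(programming_language)", "/w/api.php"),
  domain = "wikipedia.org",
  bot    = "*"
)
```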
readMDTable - Read Markdown Tables into Tibbles
Efficient reading of raw markdown tables into tibbles. It accepts content from strings, files, and URLs, and can extract and read multiple tables from a markdown document for analysis.
Last updated 2 months ago
data, markdown, markdown-parser, markdown-table
5.18 score 2 stars 1 package 2 scripts 267 downloads
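A minimal sketch of reading a markdown table from a string into a tibble. The read_md_table() function name and its acceptance of a literal string are assumed from the package description; the table content is made up for illustration.

```r
library(readMDTable)

# A small markdown table supplied as a string (illustrative data)
md <- "
| package     | downloads |
|-------------|-----------|
| robotstxt   | 3200      |
| readMDTable | 267       |
"

# Attempt to parse the markdown table into a tibble
# (read_md_table() is assumed to be the package's reader function)
read_md_table(md)
```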