Not if you want to ensure the validity of the compiled coupons/discounts. A custom algorithm would be best but data standardization would be the main issue, regardless of how you process it.
What does validity mean in this case? A function-calling LLM can follow links and take actions. I’m not saying it’s not “work” to develop your personal bot framework, but this is all doable from a home PC with a self-hosted LLM
Edit: and of course you’ll need non-LLM code to handle parts of the processing; not discounting that
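The split described above (the LLM decides, non-LLM code acts) is the usual function-calling pattern: the model emits a structured tool request and plain code dispatches it. A minimal sketch, with a stub standing in for the self-hosted model and hypothetical tool names:

```python
import json

# Stub standing in for a self-hosted function-calling LLM; a real model
# would pick the tool and arguments based on the conversation.
def llm_stub(prompt: str) -> str:
    return json.dumps({"tool": "fetch_page",
                       "args": {"url": "https://example.com/deals"}})

# Non-LLM tool implementations the bot framework provides.
def fetch_page(url: str) -> str:
    # Placeholder instead of a real HTTP call.
    return f"<html>deals from {url}</html>"

TOOLS = {"fetch_page": fetch_page}

def run_one_step(prompt: str) -> str:
    """Parse the model's tool request and dispatch it to plain code."""
    request = json.loads(llm_stub(prompt))
    return TOOLS[request["tool"]](**request["args"])
```

A real loop would feed the tool result back to the model and repeat; the point is that link-following and actions live entirely in ordinary code.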
LLMs are not a good tool for processing data like this. They would be good for presenting that data though.
LLMs are excellent at consuming web data.
LLMs are great for scraping data
LLMs don’t scrape data, scrapers scrape data. LLMs predict text.
https://youtu.be/fjP328HN-eY?si=quZeZx57fDjBW5EW
Puppeteer and GPT-Vision are decidedly not LLMs
Make an LLM convert the data into a standardized format for your traditional algorithm.
There’s no way to guarantee the data will stay in that standardized format, though. A custom model could, but those are expensive to train.
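The format-drift objection above is usually handled with a validation gate rather than a custom model: LLM output that doesn’t match the schema is rejected (and, in a real pipeline, retried). A minimal Python sketch, assuming a hypothetical coupon schema whose field names are illustrative, not from this thread:

```python
import json

# Hypothetical standardized coupon record; fields are illustrative assumptions.
REQUIRED_FIELDS = {
    "merchant": str,
    "code": str,
    "discount_pct": (int, float),
    "expires": str,  # ISO date string
}

def validate_coupon(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record conforms."""
    problems = []
    for field, expected in REQUIRED_FIELDS.items():
        if field not in record:
            problems.append(f"missing field: {field}")
        elif not isinstance(record[field], expected):
            problems.append(f"wrong type for {field}")
    if not problems and not 0 < record["discount_pct"] <= 100:
        problems.append("discount_pct out of range")
    return problems

def gate_llm_output(raw: str):
    """Parse LLM text output; accept it only if it passes validation.
    A failure here would trigger a retry prompt in a real pipeline."""
    try:
        record = json.loads(raw)
    except json.JSONDecodeError:
        return None
    return record if not validate_coupon(record) else None
```

For example, `gate_llm_output('{"merchant": "ExampleMart", "code": "SAVE10", "discount_pct": 10, "expires": "2025-01-31"}')` passes, while a drifted record like `'{"merchant": "ExampleMart", "code": "SAVE10", "discount": "10%"}'` is rejected. The traditional algorithm downstream then only ever sees conforming records.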