AI is incredible, of course. But combining different tools and use cases into a fully automated workflow is even more exciting.
As a first test project, I'd like to systematically scrape websites, interpret their content with standardized prompts, save some pictures into a specific file structure, and put the results of all agents into a Google Sheet. A rough sketch of what I have in mind follows below.
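Here's a sketch of the kind of pipeline I'm imagining, just to make the requirement concrete. I'm assuming Python with requests/BeautifulSoup for scraping, the OpenAI client for the standardized-prompt step, and gspread for the Google Sheet; the model name, sheet title, credentials file, and prompt text are all placeholders, and I'm completely open to different tooling:

```python
# Rough sketch only -- all names (model, sheet title, credentials, prompt) are placeholders.
from pathlib import Path

import gspread
import requests
from bs4 import BeautifulSoup
from openai import OpenAI

STANDARD_PROMPT = "Summarize the page content in three bullet points."  # placeholder prompt
IMAGE_DIR = Path("images")  # target file structure, e.g. images/<domain>/<filename>

client = OpenAI()  # expects OPENAI_API_KEY in the environment
gc = gspread.service_account(filename="service_account.json")  # placeholder credentials file
sheet = gc.open("Scrape Results").sheet1  # placeholder sheet title


def process_url(url: str) -> None:
    # 1. Scrape the page.
    html = requests.get(url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    text = soup.get_text(separator=" ", strip=True)

    # 2. Interpret the content with a standardized prompt.
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model
        messages=[
            {"role": "system", "content": STANDARD_PROMPT},
            {"role": "user", "content": text[:8000]},  # crude truncation
        ],
    )
    summary = resp.choices[0].message.content

    # 3. Save a few images into a per-domain folder.
    domain = url.split("/")[2]
    target = IMAGE_DIR / domain
    target.mkdir(parents=True, exist_ok=True)
    for img in soup.find_all("img", src=True)[:5]:
        src = requests.compat.urljoin(url, img["src"])
        name = src.split("/")[-1] or "image"
        (target / name).write_bytes(requests.get(src, timeout=30).content)

    # 4. Append the result to the Google Sheet.
    sheet.append_row([url, summary])


if __name__ == "__main__":
    process_url("https://example.com")
```

This is just to illustrate the steps; whether it should be plain scripts like this or an agent framework orchestrating the whole thing is exactly what I'm unsure about.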
How would you approach this requirement? Which tools or agent environments would be a good fit?
Thanks for any suggestions.