Multi-page link extraction

Hello,
Is it possible to grab content from multiple URLs on the same domain and put the results in the Full-Text RSS output?

Example:
I'm making a full-text RSS feed from blog.org, and every feed item contains something like this:

CONTENT1 | CONTENT2

I want to grab the content from each beta.blog.org link and show it in the final output.

I tried using:
single_page_link: //a[contains(@href, 'beta.blog.org')]

And yes, I have site config files for both blog.org.txt and beta.blog.org.txt, but it only shows the content of the first beta.blog.org link it finds, in the above case /view/131886.
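For reference, the two site config files look roughly like this (the body XPath in beta.blog.org.txt is a made-up placeholder for illustration, not my real rule):

blog.org.txt:

# follow the link pointing to the beta.blog.org page
single_page_link: //a[contains(@href, 'beta.blog.org')]

beta.blog.org.txt:

# placeholder XPath for the article body on beta.blog.org
body: //div[@class='post-content']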

Is there a way to let single_page_link process all the URLs it finds?

Thanks

Hi there,

Can you provide a feed URL where you're seeing this? Usually each feed item leads to an individual story. The single_page_link directive is not really intended to follow multiple links, only a single link that leads to the main content.

If the source URL is not a feed but a web page with a set of links, then you probably need to convert that page into a feed so each link appears as a separate feed item. We have a tool called Feed Creator that can help with that. See http://createfeed.fivefilters.org/ You can then pass the resulting feed to Full-Text RSS.
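For example, a Feed Creator URL for your case might look something like this (the parameter names here are my best guess; use the form at createfeed.fivefilters.org to generate the exact URL for your page):

http://createfeed.fivefilters.org/extract.php?url=http%3A%2F%2Fblog.org%2F&url_contains=beta.blog.org

That should give you one feed item per beta.blog.org link on the page. You can then pass that feed to your Full-Text RSS instance in the usual way, e.g.:

http://example.org/full-text-rss/makefulltextfeed.php?url=[URL-encoded Feed Creator URL above]

With your beta.blog.org.txt site config in place, each item should then be extracted individually.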

Best, Keyvan from FiveFilters.org

I'm using Full-Text RSS on an RSS feed as the source, and I have

CONTENT1 | CONTENT2

in each feed item.

I'm trying to extract the content from all the beta.blog.org/view/***** links and include it in the final RSS output for each feed item.

Nris