Multi page link extraction

Is it possible to grab the contents from multiple URLs on the same domain and put the results in the Full-Text RSS output?

I'm making a full RSS feed; every feed item has something like this:


I'd like to grab the contents from each URL and show them in the final output.

I tried using:
single_page_link: //a[contains(@href, '')]

(and yes, I also tried combining conditions with "and"), but it shows only the content of the first link it finds, in the above case /view/131886.

Is there a way to let single_page_link process all the URLs it finds?
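While waiting for an answer, one workaround (not a built-in Full-Text RSS feature) is to post-process the feed yourself: pull every link out of each item's description, fetch each page, and merge the results into one item. The sketch below only shows the link-extraction step; the sample feed, function name, and regex are illustrative assumptions.

```python
# Illustrative sketch: extract every href from each feed item's
# description, so each one could then be fetched and merged.
import re
import xml.etree.ElementTree as ET

SAMPLE_RSS = """<?xml version="1.0"?>
<rss version="2.0"><channel><title>demo</title>
<item>
  <title>item 1</title>
  <description>&lt;a href="/view/131886"&gt;part 1&lt;/a&gt;
  &lt;a href="/view/131887"&gt;part 2&lt;/a&gt;</description>
</item>
</channel></rss>"""

def links_per_item(rss_text):
    """Return one list of hrefs per feed item."""
    root = ET.fromstring(rss_text)
    result = []
    for item in root.iter("item"):
        desc = item.findtext("description") or ""
        result.append(re.findall(r'href="([^"]+)"', desc))
    return result

print(links_per_item(SAMPLE_RSS))
# Each item's links could then be fetched (e.g. with urllib.request)
# and the page contents concatenated into a new <description>.
```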


Hi there,

Can you provide a feed URL where you're seeing this? Usually each feed item will lead to an individual story. The single_page_link directive is not really intended to follow multiple links, just one that leads to the main content.

If the source URL is not a feed but a web page with a set of links, then you probably need to convert that page into a feed so that each link appears as a separate feed item. We have a tool called Feed Creator that can help with that. You can then pass the resulting feed to Full-Text RSS.
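The chaining described above can be sketched as follows. The endpoint paths and parameter names here are placeholders, not the real Feed Creator or Full-Text RSS API; check your installation's documentation for the actual URLs.

```python
# Hypothetical sketch of the Feed Creator -> Full-Text RSS chain.
# All URLs below are placeholders (assumptions), not real endpoints.
from urllib.parse import urlencode

PAGE_URL = "http://example.com/links-page"          # page containing a set of links
FEED_CREATOR = "http://example.com/feed-creator"    # placeholder endpoint
FULL_TEXT_RSS = "http://example.com/full-text-rss"  # placeholder endpoint

# 1. Turn the page of links into a feed.
feed_url = FEED_CREATOR + "?" + urlencode({"url": PAGE_URL})

# 2. Pass that feed to Full-Text RSS so each item gets full content.
full_feed_url = FULL_TEXT_RSS + "?" + urlencode({"url": feed_url})
print(full_feed_url)
```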

Best, Keyvan from

I'm using fulltextfeed with an RSS feed, and I have


in each feed item.

I'm trying to extract the contents from all of them and include them in the final RSS output, per feed item.