OT: help with search

jd1008 jd1008 at gmail.com
Mon Nov 2 18:22:38 UTC 2015



On 11/02/2015 11:15 AM, bruce wrote:
> ok...
>
>
> so you have a 'local' site, not a page, and you want to extract/get
> all the links for the 'domain' of the site you're looking at.
>
> you're going to need an app/process that crawls the site and
> generates the links.
>
> there are a bunch of open source tools that let you craft a process
> to do this, depending on your skillset.  (not sure what your dev
> level/skillset is)
>
> there are also browser 'plugins' that will more or less
> generate this kind of data.
>
> webscraping/crawling/links  <<< terms if you need them.
>
> let us know what else you need.
>
>
> On Mon, Nov 2, 2015 at 12:53 PM, jd1008 <jd1008 at gmail.com> wrote:
>>
>> On 11/01/2015 08:01 PM, bruce wrote:
>>> hey...
>>>
>>> is your issue, you have a specific site you can point to, and you want
>>> to get links off the site?
>>>
>>> or is it something else?
>>>
>>>
>>>
>>> On Sun, Nov 1, 2015 at 7:30 PM, jd1008 <jd1008 at gmail.com> wrote:
>>>> I googled for a way to list all items found on a single page.
>>>> What I am searching for is very very specific (in double quotes)
>>>> and only on a specific web site:
>>>> For example:
>>>>
>>>> my_Favorite_Site.com: "my specific phrase" -some_word
>>>>
>>>>
>>>> It comes up with a total of 12K hits on that web site.
>>>>
>>>> I need a way to list the URLs of all the hits, or a way
>>>> to easily capture the URLs of all hits without the
>>>> rigmarole of right-clicking each link and copying its URL.
>>>>
>>>> Has anyone found a way to accomplish this?
>>>> --
>>>> users mailing list
>>>> users at lists.fedoraproject.org
>>>> To unsubscribe or change subscription options:
>>>> https://admin.fedoraproject.org/mailman/listinfo/users
>>>> Fedora Code of Conduct: http://fedoraproject.org/code-of-conduct
>>>> Guidelines: http://fedoraproject.org/wiki/Mailing_list_guidelines
>>>> Have a question? Ask away: http://ask.fedoraproject.org
>> Just the links, so I can put them in a text file for another program to
>> go through them.
>> --
>>
No.
Not a local page. It can be any public search engine,
and it can be any specific phrase.
I already provided an example.
But that example does not give me just the raw text of the links
for the hits found, nor does it list all of them in one fell
swoop so that I could save them to a text file.
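If you can save the search-results pages to local HTML files (e.g. via
"Save Page As" in the browser, which sidesteps the fact that most search
engines block automated queries), pulling out every link in one pass is
straightforward. Here is a minimal sketch using only Python's standard
library; the filenames in the usage comment are hypothetical:

```python
# Sketch: dump every href found in one or more saved HTML pages,
# one URL per line, suitable for redirecting into a text file.
# Assumes pages were saved locally first; uses only the stdlib.
import sys
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect the href attribute of every <a> tag encountered."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html_text):
    """Return all <a href="..."> values in the given HTML string."""
    parser = LinkExtractor()
    parser.feed(html_text)
    return parser.links

if __name__ == "__main__":
    # Usage: python extract_links.py results1.html results2.html > urls.txt
    for path in sys.argv[1:]:
        with open(path, encoding="utf-8", errors="replace") as f:
            for url in extract_links(f.read()):
                print(url)
```

Note that search engines wrap result links in their own redirect URLs,
so the output may need a second filtering pass before it is fed to
another program.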



