Read cookies, write to hosts file


Char Jackson

Is there an app out there that can be launched and will read the
contents of my (Firefox) cookies and offer to add relevant entries to
my hosts file?

It would need to run under Win 7 64bit, but probably wouldn't be
limited to this OS.
 

Paul

Char said:
> Is there an app out there that can be launched and will read the
> contents of my (Firefox) cookies and offer to add relevant entries to
> my hosts file?
>
> It would need to run under Win 7 64bit, but probably wouldn't be
> limited to this OS.
So what exactly would the algorithm be?

If you disable cookie storage entirely, then web pages won't
necessarily render fully. Web pages waste most of their code,
ensuring advertising views are logged appropriately, so the web
site gets paid. And that can be gated by the successful execution
of their javascript, the storage of cookies, that sort of thing.

If you need "a good hosts file", there was at least one web site
offering such a thing, for filtering out nuisance references. But
as in the previous paragraph, that can lead to web pages being
half-rendered: news sites with no columns of text (but all the adverts
showing).

So what exactly would the objective be, with respect to hosts?
How many site removals would be enough, and how many would be
too much? If such an algorithm existed, a person could become
very rich :)

If you want to capture the *entire* log from your Internet connection,
you can do that with Wireshark, then go through the file later at your
leisure. Doesn't require building a plugin for Firefox or anything.
But it will require human judgment, as to which entries make sense to
add to hosts, and which are better off left out.

At least one particularly evil company makes an infinite number of
names for itself, and you can never expect to filter out everything
they can throw at you.

Paul
 

Char Jackson

Paul said:
> So what exactly would the algorithm be?
Parse the local hosts file
Parse the local cookie store
Remove duplicates
Offer to add hosts entries for the non-dupes
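
The four steps above could be sketched in a few lines of Python rather than VB. This is a minimal, hedged sketch: it assumes the default Win 7 hosts path and the standard Firefox cookie store (a `cookies.sqlite` database with a `moz_cookies` table and `host` column); the profile folder name shown is hypothetical, so substitute your own.

```python
import sqlite3
from pathlib import Path

# Paths are assumptions for Win 7 defaults; the profile folder name
# ("xxxxxxxx.default") is hypothetical -- use your own.
HOSTS_PATH = Path(r"C:\Windows\System32\drivers\etc\hosts")
COOKIE_DB = Path(r"C:\Users\you\AppData\Roaming\Mozilla\Firefox"
                 r"\Profiles\xxxxxxxx.default\cookies.sqlite")

def hosts_entries(path):
    """Step 1: hostnames already listed in the hosts file."""
    names = set()
    for line in Path(path).read_text().splitlines():
        line = line.split("#", 1)[0].strip()          # drop comments
        parts = line.split()
        if len(parts) >= 2:                           # "IP name [name ...]"
            names.update(h.lower() for h in parts[1:])
    return names

def cookie_hosts(path):
    """Step 2: distinct hostnames in the Firefox cookie store."""
    con = sqlite3.connect(str(path))
    rows = con.execute("SELECT DISTINCT host FROM moz_cookies").fetchall()
    con.close()
    # Firefox stores domain cookies as ".example.com"; normalize.
    return {h.lstrip(".").lower() for (h,) in rows}

if HOSTS_PATH.exists() and COOKIE_DB.exists():
    # Steps 3 and 4: remove duplicates, offer to add the non-dupes.
    candidates = sorted(cookie_hosts(COOKIE_DB) - hosts_entries(HOSTS_PATH))
    for host in candidates:
        if input("Add 0.0.0.0 %s to hosts? [y/N] " % host).lower() == "y":
            with HOSTS_PATH.open("a") as f:
                f.write("0.0.0.0 %s\n" % host)
```

Two practical caveats: writing to the hosts file needs an elevated (Administrator) prompt on Win 7, and Firefox locks `cookies.sqlite` while running, so copy the file first or close the browser.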
> If you disable cookie storage entirely, then web pages won't
> necessarily render fully. Web pages waste most of their code,
> ensuring advertising views are logged appropriately, so the web
> site gets paid. And that can be gated by the successful execution
> of their javascript, the storage of cookies, that sort of thing.
I don't want to disable cookies.
> If you need "a good hosts file", there was at least one web site
> offering such a thing, for filtering out nuisance references. But
> as in the previous paragraph, that can lead to web pages being
> half rendered, news sites with no columns of text (but all the adverts
> showing).
I don't want someone else's hosts file.
> So what exactly would the objective be, with respect to hosts?
> How many site removals would be enough, and how many would be
> too much? If such an algorithm existed, a person could become
> very rich :)
The algorithm is simple. I could fire up VB and do it myself, but
figured I'd ask first.
> If you want to capture the *entire* log from your Internet connection,
> you can do that with Wireshark, then go through the file later at your
> leisure. Doesn't require building a plugin for Firefox or anything.
> But it will require human judgment, as to which entries make sense to
> add to hosts, and which are better off left out.
I don't want to use Wireshark or build a browser plugin.
> At least one particular evil company makes an infinite number of
> names for itself, and you can never expect to filter out everything
> they can throw at you.
Cool, thanks.
 

Paul in Houston TX

Don't forget the Flash cookies, also known as Local Shared Objects
(if you use Macromedia/Adobe Flash). Few people know about them.
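
Those could be folded into the same de-dupe step. A hedged sketch, assuming the usual Win 7 layout `%APPDATA%\Macromedia\Flash Player\#SharedObjects\<random id>\<domain>\...` for the `.sol` files; that path and layout are assumptions, so verify them on your own machine.

```python
from pathlib import Path

def flash_cookie_domains(shared_objects_dir):
    """Domains that have stored Flash cookies under a #SharedObjects
    directory. Assumed layout: <dir>/<random id>/<domain>/..."""
    base = Path(shared_objects_dir)
    domains = set()
    if base.is_dir():
        for profile in base.iterdir():        # the <random id> level
            if profile.is_dir():
                domains.update(p.name.lower() for p in profile.iterdir()
                               if p.is_dir())
    return domains

# Hypothetical default location on Win 7 -- an assumption, check yours:
SOL_DIR = Path.home() / "AppData/Roaming/Macromedia/Flash Player/#SharedObjects"
if SOL_DIR.is_dir():
    print(sorted(flash_cookie_domains(SOL_DIR)))
```

The returned set can simply be unioned with the browser-cookie hostnames before diffing against the hosts file.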
 
