Mozilla flips the default switch on Firefox tracker cookie blocking

From today, Firefox users who update to the latest version of the browser will find a pro-privacy setting flipped for them on desktop and Android smartphones, assuming they didn’t already have the anti-tracking cookie feature enabled.

Mozilla launched the Enhanced Tracking Protection (ETP) feature in June as a default setting for new users, but left existing Firefox users’ settings unchanged at that point.

It’s now finishing what it started by flipping the default switch across the board in v69.0 of the browser.

The feature takes clear aim at third-party cookies that are used to track Internet users for creepy purposes such as ad profiling. (Firefox relies on the Disconnect list to identify which tracking cookies to block.)
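For a rough sense of how list-based blocking works in principle, here’s a simplified sketch (with made-up domain names and helper functions, not Firefox’s actual ETP code): a content blocker checks whether a third-party request is being served from a domain on a known-tracker list before deciding whether to let its cookies through.

```typescript
// Toy illustration of list-based tracker cookie blocking -- not Firefox's
// real implementation. Domains and helper names here are hypothetical.
const TRACKER_BLOCKLIST = new Set([
  "tracker.example",        // stand-in for entries sourced from a list like Disconnect's
  "ads.thirdparty.example",
]);

function registrableDomain(hostname: string): string {
  // Naive last-two-labels heuristic for illustration; real blockers consult
  // the Public Suffix List to find the registrable domain.
  return hostname.split(".").slice(-2).join(".");
}

function shouldBlockCookies(requestUrl: string, topLevelUrl: string): boolean {
  const requestDomain = registrableDomain(new URL(requestUrl).hostname);
  const siteDomain = registrableDomain(new URL(topLevelUrl).hostname);
  // Only third-party requests (a different site than the page being visited)
  // are candidates, and only if the serving domain is on the tracker list.
  return requestDomain !== siteDomain && TRACKER_BLOCKLIST.has(requestDomain);
}

// A script loaded from a listed tracker domain while browsing news.example:
console.log(shouldBlockCookies(
  "https://cdn.tracker.example/pixel.gif",
  "https://news.example/article",
)); // true -> its cookies would be blocked
```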

The anti-tracking feature also takes aim at cryptomining: a background practice that can drain CPU and battery power, degrading the user experience. Firefox will now block cryptomining scripts by default too, rather than only when a user switches the protection on.

In a blog post about the latest release, Mozilla says it represents a “milestone” that marks “a major step in our multi-year effort to bring stronger, usable privacy protections to everyone using Firefox”.

“Currently over 20% of Firefox users have Enhanced Tracking Protection on. With today’s release, we expect to provide protection for 100% of our users by default,” it predicts, underlining the defining power of default settings.

Firefox users with ETP enabled will see a shield icon in the URL bar to denote that tracker blocking is active. Clicking the icon opens a menu where they can view a list of all the tracking cookies being blocked. Users can also switch off tracking cookie blocking on a per-site basis via this Content Blocking menu.

While blocking tracking cookies curbs some tracking of Internet users, it does not offer complete protection for privacy. Mozilla notes that ETP does not yet block browser fingerprinting scripts from running by default, for example.

Browser fingerprinting is another prevalent privacy-hostile technique that’s used to track and profile web users without their knowledge or consent, by linking online activity to a computer’s configuration and thereby tying multiple browser sessions back to the same device and user.

It’s an especially pernicious technique because it can erode privacy across browser sessions and even across different browsers, which an Internet user might deliberately be switching between to try to prevent profiling.
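To make concrete what such scripts typically do, here’s a simplified, hypothetical sketch (real fingerprinting code samples far more signals, such as canvas rendering, installed fonts, audio and WebGL output): the script reads a handful of configuration values the browser legitimately exposes and condenses them into an identifier that tends to stay stable across sessions.

```typescript
// Simplified illustration of the fingerprinting technique described above --
// not any particular vendor's code.
function simpleHash(input: string): string {
  // Tiny non-cryptographic hash (FNV-1a style), just to turn the collected
  // signals into a compact identifier for this example.
  let hash = 0x811c9dc5;
  for (let i = 0; i < input.length; i++) {
    hash ^= input.charCodeAt(i);
    hash = Math.imul(hash, 0x01000193);
  }
  return (hash >>> 0).toString(16);
}

function collectFingerprint(): string {
  // Each value on its own is ordinary configuration data the browser exposes;
  // combined, they can single out a device without any cookie being set.
  const signals = [
    navigator.userAgent,
    navigator.language,
    String(navigator.hardwareConcurrency),
    `${screen.width}x${screen.height}x${screen.colorDepth}`,
    Intl.DateTimeFormat().resolvedOptions().timeZone,
    String(new Date().getTimezoneOffset()),
  ];
  return simpleHash(signals.join("|"));
}

// The resulting value survives cleared cookies and private windows, which is
// why it can tie separate sessions back to the same device.
console.log(collectFingerprint());
```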

Firefox users can enable a ‘Strict Mode’ in the browser’s settings in the latest release to block fingerprinting. But it’s not on by default.

Mozilla says a future release of the browser will flip fingerprinting blocking on by default too.

The latest changes in Firefox continue Mozilla’s strategy, announced a year ago, of proactively defending its browser users’ privacy by squeezing the operational range of tracking technologies.

In the absence of a robust regulatory framework to rein in the outgrowth of the adtech ‘industrial data complex’ that’s addicted to harvesting Internet users’ data for ad targeting, browser makers have found themselves at the coal face of the fight against privacy-hostile tracking technologies.

And some are now playing an increasingly central, even defining, role as they flip privacy and anti-tracking defaults.

Notably, earlier this month, the open source WebKit browser engine, which underpins Apple’s Safari browser, announced a new tracking prevention policy that puts privacy on the same footing as security, saying it would treat attempts to circumvent this as akin to hacking.

Even Google has responded to growing pressure around privacy, announcing changes to how its Chrome browser handles cookies this May. Though it’s not doing so by default yet.

It has also said it’s working on technology to reduce fingerprinting. And recently announced a long-term proposal to involve its Chromium browser engine in developing a new open standard for privacy.

Though cynics might suggest the adtech giant is responding to competitive pressure on privacy by trying to frame and steer the debate in a way that elides its own role in data mining Internet users at scale for (huge) profit.

Thus its tardy privacy pronouncements and long-term proposals look rather more like an attempt to kick the issue into the long grass and buy time for Chrome to keep being used to undermine web users’ privacy, instead of Google being forced to act now and close down privacy-hostile practices that benefit its business.