
FLoC is a recent Google proposal that would have your browser share your browsing behavior and interests by default with every site and advertiser with which you interact. Brave opposes FLoC, along with any other feature designed to share information about you and your interests without your fully informed consent. To protect Brave users, Brave has removed FLoC in the Nightly version of both Brave for desktop and Android. The privacy-affecting aspects of FLoC have never been enabled in Brave releases; the remaining implementation details of FLoC will be removed from all Brave releases with this week’s stable release. Brave is also disabling FLoC on our websites, to protect Chrome users learning about Brave.

Companies are finally being forced to respect user privacy (even if only minimally), pushed by trends such as increased user education, the success of privacy-first tools (e.g., Brave, among others), and the growth of legislation including the CCPA and GDPR. In the face of these trends, it is disappointing to see Google, instead of taking the present opportunity to help design and build a user-first, privacy-first Web, proposing and immediately shipping in Chrome a set of smaller, ad-tech-conserving changes that explicitly prioritize maintaining the structure of the Web advertising ecosystem as Google sees it.

For the Web to be trusted and to flourish, we hold that much more is needed than the complex yet conservative chair-shuffling embodied by FLoC and the Privacy Sandbox. Deeper changes to how creators pay their bills via ads are not only possible but necessary. The success of Brave’s privacy-respecting, performance-maintaining, and site-supporting advertising system shows that more radical approaches work. We invite Google to join us in fixing the fundamentals, undoing the harm that ad-tech has caused, and building a Web that serves users first.
The rest of this post explains why we believe FLoC is bad for Web users, bad for sites, and a bad direction for the Web in general.

FLoC harms privacy directly and by design: it shares information about your browsing behavior with sites and advertisers that otherwise wouldn’t have access to that information. Unambiguously, FLoC tells sites about your browsing history in a way that browsers today categorically do not. Google claims that FLoC is privacy-improving, despite intentionally telling sites more about you, for broadly two reasons, each of which conflates unrelated topics.

First, Google says FLoC is privacy-preserving compared to sending third-party cookies. But this is a misleading baseline to compare against. Many browsers don’t send third-party cookies at all; Brave never has. Saying a new Chrome feature is privacy-improving only when compared to status-quo Chrome (the most privacy-harming popular browser on the market) is misleading, self-serving, and a further reason for users to run away from Chrome.

Second, Google defends FLoC as not privacy-harming because interest cohorts are designed to be not unique to a user, using k-anonymity protections. This reflects a mistaken idea of what privacy is. Many things about a person are i) not unique, but still ii) personal and important, and shouldn’t be shared without consent. Whether I prefer to wear “men’s” or “women’s” clothes, whether I live according to my professed religion, whether I believe vaccines are a scam, whether I am a gun owner, or a Brony fan, or a million other things: these are all aspects of our lives that we might like to share with some people but not others, and under our terms and control.

FLoC adds an enormous amount of fingerprinting surface to the browser, as the whole point of the feature is for sites to be able to distinguish between user interest-group cohorts.
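The k-anonymity critique above can be made concrete with a toy model. All user names, interests, and cohort labels below are invented for illustration, and real FLoC cohorts are computed very differently; the point is only that a label shared by many users can still expose a sensitive shared attribute.

```python
from collections import defaultdict

K = 3  # minimum cohort size the "anonymity" guarantee promises

# Hypothetical dominant browsing interest per user.
user_interest = {
    "user_1": "baby-goods", "user_2": "baby-goods", "user_3": "baby-goods",
    "user_4": "fishing",    "user_5": "fishing",    "user_6": "fishing",
}

# Group users who share an interest under one opaque cohort label.
cohorts = defaultdict(list)
for user, interest in user_interest.items():
    cohorts[interest].append(user)
label_of = {interest: f"cohort_{i}" for i, interest in enumerate(sorted(cohorts))}
cohort_of_user = {user: label_of[interest]
                  for user, interest in user_interest.items()}

# The k-anonymity property holds: no label maps to fewer than K users...
assert all(len(members) >= K for members in cohorts.values())

# ...yet any observer who works out that "cohort_0" correlates with baby
# goods now knows that (possibly sensitive) interest for every member,
# without ever identifying any individual uniquely.
```

Nothing here is unique to any one user, and yet everything the label encodes is learned about all of its members at once, which is exactly the objection in the text.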
This undermines the work Brave is doing to protect users against browser fingerprinting and the statistically inferred cohort tracking that fingerprinting surface enables. Google’s proposed solution to the increased fingerprinting risk from FLoC is both untestable and unlikely to work. Google proposes using a “privacy budget” approach to prevent FLoC from being used to track users. First, Brave has previously detailed why we do not think a “budget” approach is workable to prevent fingerprinting-based tracking. We stand by those concerns, and have not received any response from Google, despite having raised the concerns over a year ago. Second, Google has yet to specify how its “privacy budget” approach will work; the approach is still in the “feasibility-testing” stage.

Google is aware of some of these concerns, but gives them shallow treatment in its proposal. For example, Google notes that some categories (sexual orientation, medical issues, political party, etc.) will be exempt from FLoC, and that it is looking into other ways of preventing “sensitive” categories from being used in FLoC.

Google’s approach here is fundamentally wrong. First, Google’s method for determining whether a FLoC cohort is sensitive requires (in most cases) Google to record and collect that sensitive cohort in the first place! A system that determines whether a cohort is “sensitive” by recording how many people are in that sensitive cohort doesn’t pass the laugh test. Second, and more fundamentally, the idea of creating a global list of “sensitive categories” is illogical and immoral. Whether a behavior is “sensitive” varies wildly across people. One’s mom may not find her interest in “women’s clothes” a private part of her identity, but one’s dad might (or might not! but, plainly, Google isn’t the appropriate party to make that choice).
Similarly, an adult happily expecting a child might not find their interest in “baby goods” particularly sensitive, but a scared and nervous teenager might. More broadly, interests that are banal to one person might be sensitive, private, or even dangerous to another. The point isn’t that Google’s list of “sensitive cohorts” will be missing important items. The point, rather, is that a “privacy-preserving system” that relies on a single, global determination of which behaviors are “privacy sensitive” fundamentally doesn’t protect privacy, or even understand why privacy is important. Visit OUR FORUM for more.

A timely reminder has been shared of how the current global chip famine has affected processor prices, in this case specifically for the AMD Ryzen 9 5950X. While retailers who have tried to stay close to MSRP are invariably out of stock, those with Ryzen 9 5950X CPUs to sell are mostly setting astronomical price tags for the Zen 3 powerhouse.

Those looking to snag a 16-core, 32-thread AMD Ryzen 9 5950X for a reasonable price will already be aware of how difficult a task that has become. The 2021 global chip shortage, caused by a combination of the coronavirus pandemic, companies shifting to work-from-home strategies, and a previously unforeseen surge in demand, has led to much-wanted PC parts, especially high-end units like the Ryzen 9 5950X CPU and GeForce RTX 3090 GPU, being sold at greatly inflated prices.

A recent Reddit post by a Redditor called locutusuk68 has triggered quite a discussion on the popular social website on this processor-pricing theme, with an accompanying screenshot revealing that the UK retailer Overclockers is currently selling the top-end Zen 3 processor for a staggering £959.99 (US$1,316/AUD$1,726). The MSRP for the AMD Ryzen 9 5950X is US$799, while PC builders in the UK may have expected to pay in the region of £750 (US$1,028/AUD$1,349) for the chip. In fact, one of the country’s largest electronics retailers, Currys, has the 16-core part listed for that fair price along with a price-match guarantee. Of course, it’s out of stock.

Shopping around does not really deliver much relief, because the stores that look like they might offer reasonable deals are either unfamiliar (Box - £849.99) or have incredibly limited stock (CCL - £899). A listing on eBay for multiple units of the Ryzen 9 5950X has a “buy it now” offer at £1,085.49 (US$1,488/AUD$1,952) per part, while a retailer called OnBuy takes the biscuit with a price tag of £1,099.95 (US$1,508/AUD$1,978).
In fact, just for added shock value, there is even a mention of AMD’s Ryzen 9 5950X being priced at an insane £1,480.72 (US$2,030/AUD$2,662). Of course, this same discouraging picture for desktop DIYers exists in other markets: Best Buy also has a price-match guarantee for the Zen 3 part at US$799 but is sold out, and if you take a look at Amazon there is sometimes stock listed as available, but in some cases you have to be willing to part with US$1,288.99.

Retailers that rely on low unit sales are simply using an age-old business tactic: hiking prices when demand exceeds supply. An accusatory finger can be pointed at Team Red, but did AMD really reckon on a million Ryzen 5000 unit sales within a few weeks of release? Supply is apparently ramping up, so arguably the best thing desktop PC builders can do right now is hold off. Eventually, supply will catch up with demand and prices will fall…or Zen 4 might even be around by the time that happens. Follow this and more by visiting OUR FORUM.

Today, Google launched an “origin trial” of Federated Learning of Cohorts (aka FLoC), its experimental new technology for targeting ads. A switch has silently been flipped in millions of instances of Google Chrome: those browsers will begin sorting their users into groups based on behavior, then sharing group labels with third-party trackers and advertisers around the web. A random set of users has been selected for the trial, and they can currently only opt out by disabling third-party cookies. Although Google announced this was coming, the company has been sparse with details about the trial until now. We’ve pored over blog posts, mailing lists, draft web standards, and Chromium’s source code to figure out exactly what’s going on.

EFF has already written that FLoC is a terrible idea. Google’s launch of this trial, without notice to the individuals who will be part of the test, much less their consent, is a concrete breach of user trust in the service of a technology that should not exist. Below we describe how this trial will work, and some of the most important technical details we’ve learned so far.

FLoC is supposed to replace cookies. In the trial, it will supplement them. Google designed FLoC to help advertisers target ads once third-party cookies go away. During the trial, trackers will be able to collect FLoC IDs in addition to third-party cookies. That means all the trackers who currently monitor your behavior across a fraction of the web using cookies will now receive your FLoC cohort ID as well. The cohort ID is a direct reflection of your behavior across the web, and it could supplement the behavioral profiles that many trackers already maintain.

As described above, a random portion of Chrome users will be enrolled in the trial without notice, much less consent. Those users will not be asked to opt in. In the current version of Chrome, users can only opt out of the trial by turning off all third-party cookies.
Future versions of Chrome will add dedicated controls for Google’s “privacy sandbox,” including FLoC. But it’s not clear when those settings will go live, and in the meantime, users wishing to turn off FLoC must turn off third-party cookies as well. Turning off third-party cookies is not a bad idea in general. After all, cookies are at the heart of the privacy problems that Google says it wants to address. But turning them off altogether is a crude countermeasure, and it breaks many conveniences (like single sign-on) that web users rely on. Many privacy-conscious users of Chrome employ more targeted tools, including extensions like Privacy Badger, to prevent cookie-based tracking. Unfortunately, Chrome extensions cannot yet control whether a user exposes a FLoC ID.

FLoC calculates a label based on your browsing history. For the trial, Google will default to using every website that serves ads, which is the majority of sites on the web. Sites can opt out of being included in FLoC calculations by sending an HTTP header, but some hosting providers don’t give their customers direct control of headers, and many site owners may not be aware of the trial at all.

This is an issue because it means that sites lose some control over how their visitors’ data is processed. Right now, a site administrator has to make a conscious decision to include code from an advertiser on their page. Sites can, at least in theory, choose to partner with advertisers based on their privacy policies. But now, information about a user’s visit to a site will be wrapped up in their FLoC ID, which will be made widely available (more on that in the next section). Even if a website has a strong privacy policy and relationships with responsible advertisers, a visit there may affect how trackers see you in other contexts. For complete details visit OUR FORUM.
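The opt-out header mentioned above is "Permissions-Policy: interest-cohort=()". A minimal sketch of a site attaching it to every response, using Python's standard-library HTTP server (the page body and port are placeholders):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# The response header that tells Chrome to exclude visits to this site
# from FLoC cohort calculation.
FLOC_OPT_OUT_HEADER = "Permissions-Policy"
FLOC_OPT_OUT_VALUE = "interest-cohort=()"

class OptOutHandler(BaseHTTPRequestHandler):
    """Serves every page with the FLoC opt-out header attached."""

    def do_GET(self):
        self.send_response(200)
        self.send_header(FLOC_OPT_OUT_HEADER, FLOC_OPT_OUT_VALUE)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.end_headers()
        self.wfile.write(b"<p>This site opts out of FLoC cohort calculation.</p>")

# To try it locally (blocks until interrupted):
# HTTPServer(("localhost", 8000), OptOutHandler).serve_forever()
```

Any server, framework, or CDN that lets a site owner set response headers can send the same value; the sticking point the article raises is that some hosting providers expose no such control.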