Some websites measure the uniqueness of a user's environment and/or the availability of fingerprinting-prone APIs, and discriminate against users and vendors of tools that apply proper mitigations.
The fact that a person uses mitigations reveals some information.
These two effects compound: the discrimination in the first point keeps users and vendors of such tools rare, which increases the information revealed by the second point.
So while standardization of observable behavior is the best measure for closing the leak, some simulation and randomization should be stacked on top of it to avoid leaking the fact that a user is running the discriminated tools.
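For example (a minimal, hypothetical sketch only; the property and the value set are illustrative, not a proposal for any particular spec): a tool that spoofs a fingerprintable read-only value could draw a plausible value once per session instead of always reporting a fixed sentinel, so the spoofing itself is not a detectable signal.

```ts
// Hypothetical sketch: a privacy tool overriding a fingerprintable API.
// Always reporting one fixed "standard" value (e.g. hardwareConcurrency === 4)
// lets sites detect the tool; picking a plausible value once per session
// blends in with unmodified browsers while staying internally consistent.

const plausibleCoreCounts: readonly number[] = [2, 4, 4, 4, 8, 8, 8, 12, 16];

function pickSessionValue<T>(choices: readonly T[]): T {
  // Use a CSPRNG so the choice is not predictable from a Math.random() seed.
  const buf = new Uint32Array(1);
  crypto.getRandomValues(buf);
  return choices[buf[0] % choices.length];
}

// Decide once per session; re-randomizing on every read would itself be an
// observable anomaly, since real hardware does not change between reads.
const sessionCoreCount = pickSessionValue(plausibleCoreCounts);

Object.defineProperty(Navigator.prototype, "hardwareConcurrency", {
  get: () => sessionCoreCount,
  configurable: true,
});
```

The design point is per-session (rather than per-read) consistency: the reported value only needs to be indistinguishable from an unmodified browser, not unlinkable across reads within the same session.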
Is there a specific example of simulation, randomization, or concealment that you can provide?
The draft notes that randomization is not typically a recommended solution, because it's hard to determine when it's more effective than a standard or null value.
I do think there's an interest in trying to reduce the visibility of privacy-enhancing modes (including incognito or private browsing modes), as in the TAG document here: https://w3ctag.github.io/private-browsing-modes/
Is there additional advice we should provide for developing new specifications so that such modes cannot be detected and discriminated against?