So one way to think about society is as a network of stakeholders with different interests. You, the individual, have your own interests. The IRS has its interests. Your competitor has theirs. Your data sharing policy should be set up to protect your interests, but that does not mean you must protect them at the expense of, say, the IRS or some other regulatory agency.
Your interests could be that you want to reduce your regulatory risk, your financial risk, your legal risk, and so on. The IRS, or whoever, has its own mission. The rational solution is to set your data sharing policy in such a way as to reduce your risks (including the risk of becoming a focus of the IRS). The easiest way to do this is to develop a technical means by which you can prove to the IRS that you've paid your fair share of taxes on your transactions, even while keeping the transactions themselves shielded.
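One way to sketch that "prove the total without revealing the parts" idea, purely as an assumption about how it could work rather than anything spelled out above, is with additively homomorphic commitments (Pedersen-style): the taxpayer publishes one hidden commitment per transaction and later opens only the sum. The parameters, amounts, and function names below are toy values for illustration, nowhere near a secure construction, and a real system would also need a zero-knowledge proof tying the commitments to the actual shielded transactions.

```python
# Toy additively homomorphic commitments (Pedersen-style): hide each
# transaction amount, publish the commitments, and open only the TOTAL so
# an auditor can check the declared tax base. Tiny parameters, demo only.
import random

P = 2_147_483_647          # toy prime modulus
G, H = 3, 7                # toy generators; in a real system nobody may know log_G(H)

def commit(amount: int, blinding: int) -> int:
    """C = G^amount * H^blinding mod P  (the blinding factor hides the amount)."""
    return (pow(G, amount, P) * pow(H, blinding, P)) % P

# Taxpayer side: commit to each (hypothetical) shielded transaction amount.
amounts   = [1200, 450, 8300]
blindings = [random.randrange(1, P - 1) for _ in amounts]
commitments = [commit(a, r) for a, r in zip(amounts, blindings)]

# The product of commitments is itself a commitment to the sum of the amounts.
aggregate = 1
for c in commitments:
    aggregate = (aggregate * c) % P

# Taxpayer reveals only the total amount and the combined blinding factor.
total_amount   = sum(amounts)
total_blinding = sum(blindings)

# Auditor side: verify the opened total against the published commitments,
# without ever seeing any individual transaction.
assert aggregate == commit(total_amount, total_blinding)
print("declared tax base verified:", total_amount)
```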
How is this possible more generally? That is one of the promises of advanced cryptography such as homomorphic encryption: the idea that we can keep our data encrypted and still compute on it. The IRS example is probably a bad one, because the tax code is so hellishly complex that implementing something like that would be difficult, but for many applications, such as reputation scores and peer-matching algorithms, you can get major benefits while keeping all the data encrypted.
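Here is a minimal from-scratch sketch of what "compute on encrypted data" means, using a Paillier-style additively homomorphic scheme. The primes, messages, and function names are made up for illustration; a real deployment would use a vetted library with much larger keys. The point is only that multiplying two ciphertexts yields a ciphertext of the sum, so a server can add numbers it cannot read.

```python
# Toy Paillier-style additively homomorphic encryption (insecure, demo only).
import math
import random

# --- toy key generation ---------------------------------------------------
p, q = 1789, 1861                  # small primes, illustration only
n = p * q
n_sq = n * n
lam = math.lcm(p - 1, q - 1)       # Carmichael function lambda(n)
g = n + 1                          # standard simple choice of generator
mu = pow(lam, -1, n)               # modular inverse of lambda mod n

def encrypt(m: int) -> int:
    """Encrypt message m (0 <= m < n) under the public key (n, g)."""
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n_sq) * pow(r, n, n_sq)) % n_sq

def decrypt(c: int) -> int:
    """Decrypt ciphertext c with the private key (lam, mu)."""
    L = (pow(c, lam, n_sq) - 1) // n
    return (L * mu) % n

# --- the homomorphic property ---------------------------------------------
# Multiplying ciphertexts adds the underlying plaintexts: a server can sum
# values it never sees in the clear.
a, b = 1234, 5678
c_sum = (encrypt(a) * encrypt(b)) % n_sq
assert decrypt(c_sum) == a + b
print("sum computed on encrypted data:", decrypt(c_sum))
```

A scheme like this only supports addition (and multiplication by known constants); fully homomorphic encryption extends the idea to arbitrary computation, at a much higher cost.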
A peer-matching algorithm, for example, could bring together people who know nothing about each other but whom the machines know intimately. This would solve a real social problem using privacy-preserving technology, a problem that can't be solved with more transparency. Transparency can't make people trust each other or bring them together, and it doesn't separate lies from truth. Big data can be analyzed to get to know a billion people, and then a matching algorithm can connect the right people to each other.
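As a simplified sketch of what such a matcher might look like: score every pair of people by the similarity of their interest profiles, then greedily pair off the most compatible strangers. The names, profiles, and greedy pairing rule below are all hypothetical, and this version works on plaintext; in a privacy-preserving version the interest vectors would stay encrypted (e.g. under a scheme like the one above) and only the resulting matches would ever be revealed.

```python
# Plaintext sketch of a peer-matching algorithm (hypothetical data).
from itertools import combinations
from math import sqrt

# Hypothetical interest profiles; in practice these would be derived from
# large-scale behavioural data rather than hand-written vectors.
profiles = {
    "alice": [0.9, 0.1, 0.7, 0.0],
    "bob":   [0.8, 0.2, 0.6, 0.1],
    "carol": [0.1, 0.9, 0.0, 0.8],
    "dave":  [0.2, 0.8, 0.1, 0.9],
}

def cosine(u: list[float], v: list[float]) -> float:
    """Cosine similarity between two interest vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v)))

# Score every pair, then match greedily from the most compatible downward.
scored = sorted(
    ((cosine(profiles[a], profiles[b]), a, b)
     for a, b in combinations(profiles, 2)),
    reverse=True,
)

matched: set[str] = set()
for score, a, b in scored:
    if a not in matched and b not in matched:
        matched.update({a, b})
        print(f"match {a} <-> {b} (compatibility {score:.2f})")
```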