Thanks, but this didn't really solve my problem. If you are running a service that displays IPFS content (such as steemit), you cannot simply "avoid downloading a hash"; the download happens at a user's request.
The maximum storage size is a global limit; I want to limit the maximum size of an individual file. For example, I could store 100 GB of 2 MB images, but not 300 MB movies or 3 MB songs.
Also, I would rather not have a single web request trigger a 1 GB download from IPFS.
It depends on how you run your service. If you run an HTTP gateway of your own, then yes, filtering that content is hard, but you have the same problem with regular HTTP. If users run their own IPFS nodes, you don't store any content at all.
You should be able to estimate the size of a hash by getting the object and counting the number of links in it. The number of links times the maximum data size gives a (very) rough size indication, but one close enough to distinguish between small and large files.
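Here is a minimal sketch of that heuristic in Go, assuming a local daemon with the ipfs CLI on PATH, the default 256 KiB chunk size, and the plain-text `ipfs object stat` output containing a `NumLinks:` line. It will be less accurate for very large files (extra layers of indirection) and for small files whose data fits in the root block:

```go
package main

import (
	"fmt"
	"os"
	"os/exec"
	"strconv"
	"strings"
)

// Default unixfs chunk size (an assumption; adjust if your node
// was configured with a different chunker).
const maxChunkSize = 256 * 1024

// estimateSize shells out to `ipfs object stat <hash>` and multiplies the
// reported NumLinks by the chunk size, giving a rough size estimate
// without fetching the file's data blocks.
func estimateSize(hash string) (int64, error) {
	out, err := exec.Command("ipfs", "object", "stat", hash).Output()
	if err != nil {
		return 0, err
	}
	for _, line := range strings.Split(string(out), "\n") {
		// Assumes the plain-text output contains a line like "NumLinks: 5".
		if strings.HasPrefix(line, "NumLinks:") {
			n, err := strconv.ParseInt(strings.TrimSpace(strings.TrimPrefix(line, "NumLinks:")), 10, 64)
			if err != nil {
				return 0, err
			}
			// Note: a file small enough to fit in a single block has zero links.
			return n * maxChunkSize, nil
		}
	}
	return 0, fmt.Errorf("NumLinks not found in `ipfs object stat` output")
}

func main() {
	if len(os.Args) < 2 {
		fmt.Fprintln(os.Stderr, "usage: estimate <hash>")
		os.Exit(1)
	}
	size, err := estimateSize(os.Args[1])
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Printf("%s: roughly %d bytes at most\n", os.Args[1], size)
}
```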
From the command line, try

ipfs file ls /ipfs/<path>

to get the sizes of the individual files, or

ipfs object stat <hash>

to get the stats (including the cumulative size) of a single object.
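Building on that, a service could check the reported CumulativeSize before it fetches or pins anything on a user's behalf. A rough sketch, again assuming the plain-text `ipfs object stat` output and a running local daemon; the 10 MiB cap is just a placeholder:

```go
package main

import (
	"fmt"
	"os"
	"os/exec"
	"strconv"
	"strings"
)

// Example per-file cap of 10 MiB; pick whatever limit fits your service.
const maxFileSize = 10 * 1024 * 1024

// cumulativeSize parses the "CumulativeSize: <n>" line printed by
// `ipfs object stat <hash>`.
func cumulativeSize(hash string) (int64, error) {
	out, err := exec.Command("ipfs", "object", "stat", hash).Output()
	if err != nil {
		return 0, err
	}
	for _, line := range strings.Split(string(out), "\n") {
		if strings.HasPrefix(line, "CumulativeSize:") {
			return strconv.ParseInt(strings.TrimSpace(strings.TrimPrefix(line, "CumulativeSize:")), 10, 64)
		}
	}
	return 0, fmt.Errorf("CumulativeSize not found for %s", hash)
}

func main() {
	if len(os.Args) < 2 {
		fmt.Fprintln(os.Stderr, "usage: gated-pin <hash>")
		os.Exit(1)
	}
	hash := os.Args[1]
	size, err := cumulativeSize(hash)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	if size > maxFileSize {
		fmt.Printf("refusing %s: %d bytes exceeds the per-file cap\n", hash, size)
		return
	}
	// Only content that passed the size check gets fetched and pinned.
	if err := exec.Command("ipfs", "pin", "add", hash).Run(); err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Printf("pinned %s (%d bytes)\n", hash, size)
}
```

Since the child sizes are recorded in the parent node, the stat call itself should only need to fetch the root block, not the whole file.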