I don't think it will be seen that way. The FTC is looking at whether content would appeal to kids, not just whether it's aimed at kids. The examples the FTC gives of things that would indicate kid appeal include stuff like animated characters on a website, which I hope doesn't carry over to videos but suspect it will. Most animation will appeal to kids even if it isn't aimed at them (I'm not sure about the art style of shows like Archer, BoJack Horseman or Ugly Americans, but most animation would, and BoJack might still qualify just by featuring animals).
YouTube is also kind of concerning in this regard if they don't have actual people handling this stuff, because their algorithm doesn't work very well for this sort of thing. Charlie the Unicorn got the first part of its finale this year, and when I re-watched the old episodes on my partner's YouTube account to refresh my memory, YouTube was recommending toilet training videos and nursery rhymes alongside them. I've had comedy songs with animated music videos trigger nursery rhyme recommendations before too.
Regardless of YouTube's algorithm issues, it's the FTC that is ultimately enforcing this, and they have stated they're focusing on whether content would appeal to kids, not just whether it's aimed at kids, which is concerning because that will cover a lot of content.
Mind you, this might not be the biggest problem. YouTube has just released updated terms, taking effect in December, that cover them to remove accounts deemed not to be profitable. This could just be an easy way to deal quickly with people who make the platform look bad (which might mean fewer future adpocalypses), but it could also be a real problem: accounts could be removed because YouTube doesn't like their views, or because they aren't monetisable, whether from being advertiser-unfriendly, being kids' content that can no longer carry targeted ads, or any other reason. It depends on the intention behind that line.

It's a good clause for covering themselves so they can respond quickly to the worst situations (though they don't really do that anyway: they manually reviewed the suicide forest video and kept it up until people kicked up enough of a stink for them to remove it, while punishing PewDiePie far more harshly for something that wasn't good but was nowhere near as bad as what Logan Paul did). Still, it's a concerning term to have put in there. I hope it's just for worst-case scenarios and not an easier out to remove any account they want.