I've been sitting back and consuming all the information I can and thinking about this whole thing.
Here are some things some people have had to say:
- Mark Pilgrim (who started the debate)
- The original problem
I think it's all a bunch of crap.
However, I think it's a good idea to have liberal parsers.
Sound like I'm confused? Not really. I'm leaning towards what Joe has discussed before: allow the data to be parsed, but alert the user.
The real debate is this: if we allow bad feeds through, why would anyone take the time to fix them? But if we don't let them through, the user can't read them at all, even when there's useful data in there.
So I propose a compromise. We let the data through even if it's bad, but we alert the user and send email notifications to the addresses in the webmaster/editor fields of the RDF file. That's what they're there for, right? I can pretty much guarantee that if a website has 500 readers, and the feed is broken, and the webmaster gets 500 emails saying there's a problem... they'll fix it.
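The compromise could be sketched roughly like this. This is a hypothetical `check_feed` helper (my names, not from any real aggregator) that flags a malformed feed but still digs the webmaster/editor contact out of the raw text, so the notification could be sent even when the feed won't parse:

```python
import re
import xml.etree.ElementTree as ET

def check_feed(xml_text):
    """Return (ok, contact): ok is False when the feed is malformed,
    and contact is the webmaster/editor address (if any) a reader
    could notify. A sketch only -- a real aggregator would use a
    proper liberal parser, not a strict parse plus a regex."""
    # Naive well-formedness check: stand-in for a real feed parser.
    try:
        ET.fromstring(xml_text)
        ok = True
    except ET.ParseError:
        ok = False
    # Pull a contact address out of the webMaster/managingEditor
    # fields of the raw text, even when the XML itself is broken.
    m = re.search(r"<(?:webMaster|managingEditor)>\s*([^<\s]+@[^<\s]+)",
                  xml_text)
    contact = m.group(1) if m else None
    return ok, contact
```

A reader hitting `ok == False` would show the feed anyway, display a warning, and queue one email to `contact`.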
It seems like a very obvious solution, and maybe I'm missing something.
I'm sure someone is going to complain that it's a mean thing to do, but to me it seems like the obvious choice.
So that's my opinion.
This was also posted to The aggregators group