We have 5,000 PDFs that should, in total, be no larger than 200 GB (so roughly 40 MB each on average). They will likely need to be updated throughout the year in batches of around 1,000.
As I see it, there are two main routes:
1) Publish the PDFs and their associated metadata through Tridion
2) Import the PDFs directly into the delivery environment and manage only their metadata in Tridion
A compelling business reason for putting these PDFs through the CMS is the route to getting them onto production (CMS = easy; non-CMS = not easy at all) and the control it gives directly to the business.
We would certainly prefer to manage the metadata directly on the binary item and to take advantage of component linking (to track where-used, etc.), rather than mapping metadata components with "links" to binaries that sit outside the CMS. So going through the CMS seems to make more sense to me.
Now there's the question of bloating the database and blocking the publishing queue...
Some of these items may need to go through workflow (if we batch-upload through WebDAV, I presume we could define specific cartridges for specific folders and therefore associate different schemas?). However, using WebDAV would presumably mean the PDFs (and their historical versions) would be stored in the database, which could be problematic. A rough batch-upload sketch is below.
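For context, this is the kind of batch upload I have in mind: a minimal sketch assuming the standard Tridion WebDAV connector, where the folder URL, credentials, and local paths are all placeholders rather than our actual setup. Since WebDAV is just HTTP, each file becomes a PUT against the target folder:

```python
import os
import requests
from requests.auth import HTTPBasicAuth

# Hypothetical WebDAV folder exposed by the CMS; the exact URL structure is an assumption.
WEBDAV_FOLDER = "http://cms.example.com/webdav/MyPublication/Building%20Blocks/Content/PDFs"
AUTH = HTTPBasicAuth("DOMAIN\\svc_upload", "change-me")  # placeholder credentials


def upload_batch(local_dir: str) -> None:
    """PUT each PDF in local_dir into the WebDAV folder (one multimedia item per file)."""
    for name in sorted(os.listdir(local_dir)):
        if not name.lower().endswith(".pdf"):
            continue
        path = os.path.join(local_dir, name)
        with open(path, "rb") as f:
            resp = requests.put(
                f"{WEBDAV_FOLDER}/{name}",
                data=f,
                auth=AUTH,
                headers={"Content-Type": "application/pdf"},
            )
        resp.raise_for_status()
        print(f"Uploaded {name}: HTTP {resp.status_code}")


if __name__ == "__main__":
    upload_batch("./batch_q1")  # hypothetical local folder for one batch
```

Each PUT creates (or versions) an item in the Content Manager database, which is exactly where my concern about bloat from historical versions comes from.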
So... we could reference these in Tridion as external-link components, but I presume that would mean we couldn't use WebDAV (or could we still use WebDAV with externally linked PDFs? That doesn't seem to make sense).
I'm sure managing large quantities of binaries in (or around) the CMS is something many of us have come across, and I would be very interested to hear how others have approached this dilemma.
Thanks