From manuscript preparation to business-related best practices, scholarly publishers increasingly integrate data capture and analysis into their systems. These efforts are considered essential to enable interoperability, ensure transparency, and build trust with authors, funders, and institutions.
On Thursday, October 20, at the Frankfurt Book Fair, CCC presented The Data Quality Imperative: Improving the Scholarly Publishing Ecosystem, a special Frankfurt Studio session. Watch the recording.
Click below to listen to the latest episode of the Velocity of Content podcast.
Panelists Sybille Geisenheyner, Director of Open Science Strategy & Licensing, American Chemical Society; Dr. Johanna Havemann, Trainer and Consultant in Open Science Communication and Co-founder & Strategic Coordinator, AfricaArxiv; and Laura Cox, Senior Director, Publishing Industry Data, CCC, shared insights with me on the impact of a consistent data quality strategy on the people, processes, and technology in your organization.
“There needs to be trust in the data, and that is always something we start with,” said Geisenheyner. “The data will never be 100% perfect, but it can get even better. How many articles get published? What is the APC spend on an individual basis? What is the subscription spend? We do have a lot of data. And to share that data is key.”