How document templates can facilitate or prohibit multipurpose documents
1. How Document Templates can Facilitate or Prohibit Multipurposing of Documents
Michiel Stam – Manager Regulatory Operations
Qdossier B.V.
2. Agenda
Multipurpose & Templates
Content and Context of use
Metadata and subject definition
Facilitating multipurpose documents
Template granularity, naming and numbering
Submission content plan
Harmonised naming
Control of templates
Location of contents
6/24/2013 www.qdossier.com - proprietary 2
3. Multipurpose
Avoid duplication of work
Single authoritative source
Know “where used”
Product life cycle management
Regulatory compliance
Reuse of document contents in a different context across products, regions and formats
4. About document templates
Captures predefined contents
Defines styles, headers and headings
Provides guidance and example text
Captures descriptive information about contents
Part of a system for organizing documents by subject
Focus on word-processed format (electronic media)
5. Content and Context – Carrot and potato recipes
[Diagram: recipe analogy linking manufacturers, competent authorities, production processes, (active) substances and drug products]
6. Content and Context – Every item documented
[Diagram: every item (manufacturers, competent authorities, production processes, (active) substances) is documentedted]
7. Content and Context – Cross references
[Diagram: cross references between manufacturers, competent authorities, production processes and (active) substances]
8. Content and Context – Branding
[Diagram: branding across manufacturers, competent authorities, production processes and (active) substances]
9. Context of use
The context of use of a content carrier (e.g. a document) comprises:
Location of the document in the dossier or DMS
Cross references to other documents
Branding within the content carrier
10. Template metadata selection
No out-of-the-box system available
Roles and responsibilities
Storage, retrieval and permission control
Usage across tools
To describe content only (subject)
Contextual metadata in eDMS doc properties only
11. Definition of subject
Discrete piece of content
Specific topic
Identifiable purpose
Stand-alone
Reusable in various contexts
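The subject definition above can be sketched as a small metadata record. The following is a minimal illustration in Python; the field names and example values are hypothetical, since the deck does not prescribe a schema:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SubjectMetadata:
    """Describes template content by subject only; contextual metadata
    (dossier location, product, region) stays in eDMS document properties."""
    topic: str        # specific topic, e.g. a container closure system
    purpose: str      # identifiable purpose of the content
    ctd_section: str  # CTD anchor such as "P.7", without submission numbering
    reusable: bool = True  # stand-alone, reusable in various contexts

meta = SubjectMetadata(
    topic="Container closure system",
    purpose="Describe vial and cap for the drug product",
    ctd_section="P.7",
)
```

Keeping the record frozen mirrors the idea that subject metadata describes the content itself and should not be edited per submission context.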
12. Facilitating multipurpose documents (I)
Templates:
Well defined template granularity
Strong document naming
Generic cross-references (exchangeable destination)
Standardized document contents
Guidance, training and standardization:
Metadata
Lean authoring
Versioning
Supported by submission content plan:
Document naming
Outline and granularity
Metadata and context of use
Relations between documents in context of use
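As a rough illustration, a submission content plan entry might capture the four items listed above (naming, granularity, metadata, relations in context of use). All keys and values here are hypothetical:

```python
# Hypothetical content-plan entry; the deck does not define a data model.
plan_entry = {
    "document_name": "P.7 Container Closure System Vial Cap",
    "granularity": "one document per container closure system",
    "metadata": {"subject": "container closure system"},
    "context_of_use": {
        "dossier_location": "3.2.P.7",
        "related_documents": ["P.1 Description", "P.8 Stability"],
    },
}
```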
13. Facilitating multipurpose documents (II)
Supported by document management process
Store and locate templates and documents
Capture and define document naming and properties
Version control of templates and documents
Manage document locations
Maintain relationships between documents
Supported by training and monitoring
14. Template granularity
Separation of reusable and submission-specific contents
Multiple granularity options:
Single or multiple docs per CTD section
Specific or non-specific to the <subject> (e.g. drug product)
P.1 Description <container name>
P.7 <container name>
P.7 <free text> <container name>
P.7 <free text>
Option for single document, specific to the container
Option for multiple documents, specific and non-specific to the container
(preferred)
15. Document numbering
Exclude (sub)section numbering from eCTD names
Facilitate reuse across CTD module 3 and Asean CTD part II
Omit redundant numbering in eCTD
Consider cross-references!
16. Document naming
Strong naming
Minimize contextual information
Specific enough to specify contents (future proof!)
Standardized naming variables
According to internal and external conventions
Harmonized naming
Across eDMS
Content plans
Document templates
Published output (e.g. electronic submissions or SharePoint)
Multiple document name types
eDMS name
(e)CTD name
Output file name
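A minimal sketch of how one set of standardized naming variables could drive the three name types (eDMS name, eCTD name, output file name). The function and the file-name convention are assumptions for illustration, not taken from the deck:

```python
def build_names(ctd_section: str, subject: str, variables: list[str]) -> dict:
    """Derive the three name types from one set of standardized variables."""
    base = " ".join([subject, *variables])
    return {
        # internal eDMS name keeps the CTD section for search and retrieval
        "edms": f"{ctd_section} {base}",
        # eCTD leaf title omits redundant module/section numbering
        "ectd": base,
        # published file name: lowercase, hyphen-separated (assumed convention)
        "file": "-".join(f"{ctd_section} {base}".lower().split()) + ".pdf",
    }

names = build_names("3.2.P.7", "Container Closure System", ["Cap", "Vial"])
# names["ectd"] == "Container Closure System Cap Vial"
```

Deriving all three names from the same variables is one way to keep naming harmonized across the eDMS, content plans, templates and published output.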
17. Multipurpose naming
Container part fit for multiple container closure systems, across multiple products and regions:
3.2.P.7 Cap 3ml Vial Qdrug
P.7 Cap 3ml Vial Qdrug
P.7 Cap 3ml Vial
P.7 Cap Vial
P.7 Vial Cap 13mm
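The naming progression above (dropping module numbering, the fill volume and the product name) can be sketched as a filter over contextual tokens. The token set is hypothetical:

```python
# Hypothetical contextual tokens: product name and fill volume
CONTEXT_TOKENS = {"Qdrug", "3ml"}

def generalize(name: str) -> str:
    """Drop module numbering and contextual tokens so one document name
    fits multiple products and regions (e.g. 3.2.P.7 becomes P.7)."""
    tokens = [t.replace("3.2.", "") if t.startswith("3.2.") else t
              for t in name.split()]
    return " ".join(t for t in tokens if t not in CONTEXT_TOKENS)

# generalize("3.2.P.7 Cap 3ml Vial Qdrug") -> "P.7 Cap Vial"
```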
20. Control of templates and documents
How to manage template versions?
Existing document is current, finalized and approved
Keep track of changes
Ownership of multipurpose templates
Authorization and approval of draft documents
Review procedure not limited to contents only
Contents in context of multipurpose
Metadata
21. Facilitating Lean Authoring
Trained Authors
Contents according to defined location
Avoid contextual information
Proper use of Word features
Cross references, headers, headings, captions etc.
Template guiding information and examples
Authoring style
Granularity
Header information
Document naming
Cross-reference style
Contents expected
22. Location of contents (I)
Defined by the template; however:
Stability data for intermediates appears in:
Intermediates section
Stability section
Art 46 Statement appears as:
Addendum to the Cover Letter?
Addendum to M.2.5 Clinical Overview?
23. Location of contents (II)
Batch numbering in 3.2.P.3.3 and not together with batch analyses in 3.2.P.5.4
Stability conclusions only in 3.2.S.7.1 and 3.2.P.8.1 (not in other stability sections)
Process validation on intermediates:
Included in S.2.4 Intermediates?
Or in S.2.5 Process validation?
Or in S.2.6 Manufacturing process development?
24. Templates as part of RIM process
[Diagram: templates within the RIM process, connecting the global tracking system, submission content plan, search of eDMS documents and templates, CTD document template, document authoring, document storage, electronic submissions, Health Authority, SharePoint, cloning, archive and eCTD viewer]