High-quality Linked Data is an important factor for the success of the Semantic Web. However, the quality of generated Linked Data is typically assessed and refined only after the dataset has been generated, which is computationally intensive. Given that Linked Data is typically generated from (semi-)structured data, which strongly influences the intrinsic quality dimensions of the resulting Linked Data, I investigate how a generation process can be validated automatically before any RDF data is generated. Current generation processes, however, are not easily validated: descriptions of the data transformations depend on the use case or are incomplete, and existing validation approaches would require manual (re-)definition of test cases aimed at the generated dataset. I propose (i) a generic approach to declaratively describe a generation process, and (ii) a validation approach for automatically assessing the quality of the generation process itself. By aligning declarative data and schema transformations, the generation process remains generic and independent of the implementation. The transformations can be validated automatically based on constraint rules that apply to the generated RDF data graph, using custom entailment regimes. Preliminary results show that the generation process of DBpedia can be described declaratively and (partially) validated.
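To make the core idea concrete, the following is a minimal illustrative sketch, not the paper's implementation: instead of validating the generated RDF output, a constraint rule is checked against the declarative mapping rules themselves, so a violation is detected before any triple is produced. The mapping and constraint dictionaries, the field and predicate names, and the `validate_mapping` helper are all hypothetical.

```python
# Hypothetical declarative mapping rules:
# source field -> (target predicate, declared datatype range).
mapping = {
    "birth_date": ("dbo:birthDate", "xsd:date"),
    "name": ("foaf:name", "xsd:string"),
}

# Hypothetical schema constraints: predicate -> required datatype range.
constraints = {
    "dbo:birthDate": "xsd:date",
    "foaf:name": "xsd:string",
}

def validate_mapping(mapping, constraints):
    """Return declared ranges that violate a schema constraint,
    checked on the mapping itself, before any RDF is generated."""
    violations = []
    for field, (predicate, declared_range) in mapping.items():
        required = constraints.get(predicate)
        if required is not None and required != declared_range:
            violations.append((field, predicate, declared_range, required))
    return violations

print(validate_mapping(mapping, constraints))  # -> [] (mapping conforms)
```

A mapping that declared `xsd:string` for `dbo:birthDate` would be flagged here without generating a single triple; the actual approach expresses both the transformations and the constraints declaratively and applies custom entailment regimes rather than a hard-coded check like this one.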