Use of Generative Models for Text and Image Creation in the DFG’s Funding Activities

The DFG has formulated initial guidelines for dealing with generative models for text and image creation. Among other things, this results in concrete guidelines for applicants in the DFG's funding activities.

The Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) has formulated initial guidelines for dealing with generative models for text and image creation. In a statement, the DFG examines the influence of ChatGPT and other generative AI models on research processes and on its own funding activities. As a starting point for continuous monitoring, the paper provides initial guidance for researchers in their work as well as for applicants to the DFG and those involved in the review, evaluation and decision-making process. In the DFG’s view, AI technologies are already changing the entire work process in science and the humanities, knowledge production and creativity to a considerable degree, and are being used in a variety of ways across the research disciplines. With regard to generative models for text and image creation, this development is only just beginning. In an ongoing process, the DFG will analyse and assess the opportunities and potential risks of using generative models in science and the humanities and in its own funding activities.

In the context of proposal submission to the DFG, the use of generative models is generally permissible in view of the considerable opportunities and development potential. In terms of content, full responsibility for adherence to research integrity remains with the applicant. However, the use of generative models in the context of the research process requires certain binding framework conditions to ensure good research practice and the quality of its results. The standards of good research practice generally established in science and the humanities are also fundamental here.

Methodological transparency and traceability are essential basic principles of research integrity that also apply to funding proposals to the DFG. The use of generative models must therefore be disclosed in a scientifically appropriate manner, taking into account the specifics of the respective subject area. This is explicitly stated in the proposal guidelines. “Disclosure” is to be understood as indicating which generative models were used for what purpose and to what extent, for example in the preparation of the state of research, in the development of a scientific method, in the evaluation of data or in the generation of hypotheses. AI tools that do not affect the scientific content of the proposal (e.g. grammar, style and spelling checkers, translation programmes) do not have to be documented. The disclosure can be provided in a few explanatory sentences; it is explicitly not necessary to mark the affected text passages individually in the proposal.
