Generative AI Policies

PSR-Press Policy on the Use of Generative AI and AI-Assisted Technologies

This policy applies to all journals published by Ptolemy Scientific Research Press (PSR-Press). It has been developed in response to the rapid growth of generative AI and AI-assisted technologies (“AI tools”) in research and scholarly communication. Our aim is to provide clear guidance to authors, reviewers, editors and readers, and to promote transparent, responsible and ethical use of these tools.

PSR-Press will continue to review and update this policy as practices, tools and community standards evolve.

1. General Principles

PSR-Press recognizes that AI tools can support researchers by helping them explore literature, organize content, improve readability, and generate or refine ideas. However:

  • AI tools must not replace the researcher’s own critical thinking, expertise, and judgment.

  • Human authors remain fully responsible and accountable for all content submitted to PSR-Press journals.

  • Use of AI tools must be transparent, properly documented, and consistent with ethical, legal, and professional standards.

2. For Authors

2.1 Use of AI Tools in Manuscript Preparation

Authors may use generative AI and AI-assisted technologies while preparing their manuscripts, for example to:

  • Structure and organize text or outline sections

  • Improve clarity, grammar and readability

  • Summarize or synthesize literature they have already read

  • Generate ideas or alternative formulations of text, which they then critically review and revise

However, AI tools must not be used to:

  • Generate entire manuscripts with minimal human input

  • Produce unverified factual statements, data, or references

  • Replace the author’s own analysis, interpretation, or argumentation

At all times, authors must retain full intellectual control over the manuscript and ensure that the final work reflects their own original contribution.

2.2 Author Responsibilities

By submitting to a PSR-Press journal, authors accept full responsibility for:

  1. Accuracy and integrity

    • Carefully checking any AI-generated or AI-assisted output for factual accuracy, completeness, and impartiality.

    • Verifying all references and citations; fabricated, incorrect or incomplete references are not acceptable.

  2. Original contribution

    • Substantially revising and adapting any AI-suggested text so that the manuscript expresses the authors’ own reasoning, interpretation and conclusions.

    • Ensuring that the work is not plagiarized, self-plagiarized, or in any way misrepresents the origin of ideas or text.

  3. Legal and ethical compliance

    • Respecting data protection, confidentiality, intellectual property rights, and any contractual or institutional obligations.

    • Ensuring that use of AI tools does not violate applicable regulations or ethical standards, particularly where personal or sensitive data are involved.

  4. Terms of use of AI tools

    • Reviewing the terms and conditions of any AI tool used to confirm that:

      • Confidential or unpublished material (including manuscripts and underlying data) is not used to train the model or shared beyond what is necessary to provide the service.

      • No rights are granted to the AI provider that could conflict with later journal publication (e.g., exclusive licenses or restrictions on reuse of outputs).

    • Avoiding tools whose terms could compromise the confidentiality of the work or the authors’ rights.

2.3 Responsible Use, Privacy and Confidentiality

Authors must exercise particular care when using AI tools with any of the following:

  • Unpublished manuscripts, internal reports, or confidential data

  • Personally identifiable information or sensitive personal data

  • Proprietary or third-party materials (e.g., images, code, datasets)

Such material should not be uploaded to AI tools unless the authors are certain that:

  • The tool’s terms safeguard confidentiality and data security.

  • There is no transfer of rights that would limit publication or infringe third-party rights.

Authors are responsible for ensuring that they have the right to use both the inputs and outputs of AI tools in their submitted work.

2.4 Disclosure of AI Use

The use of AI tools in manuscript preparation must be openly disclosed.

  • Authors must include a brief AI declaration in their manuscript (e.g., in a dedicated “AI Use Statement” or “Author’s Note”) at submission.

  • This statement should specify:

    • The name and, where relevant, version of the AI tool(s) used

    • The purpose of use (e.g., language editing, outline generation, code assistance, text summarization)

    • The extent of human oversight (e.g., “All AI-suggested text was carefully reviewed and extensively revised by the authors”)
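
An illustrative statement covering these points (the tool name and wording are placeholders to be adapted by the authors to their actual use) might read: “During the preparation of this manuscript, the authors used [tool name, version] to improve the clarity and readability of the text. All AI-suggested content was reviewed, revised, and verified by the authors, who take full responsibility for the final manuscript.”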

Routine proofreading for spelling, grammar, and punctuation using standard tools (e.g., spell-checkers, grammar checkers) does not require a formal AI declaration.

If AI tools are used as part of the research process itself (e.g., in data analysis, text mining, model development), this must be described in detail in the Methods or equivalent section, including sufficient information for reproducibility.

2.5 Authorship

AI tools cannot be credited as authors or co-authors.

  • Authorship requires responsibilities that only humans can fulfill, including:

    • Designing or substantially contributing to the work

    • Interpreting results and drawing conclusions

    • Approving the final version and accepting responsibility for the work’s integrity

  • Authors must not list AI tools (e.g., “ChatGPT” or “Claude”) as authors, nor cite them as if they were human contributors.

Each listed author is responsible for:

  • Ensuring the work is original and has not been previously published (unless explicitly permitted, for example as a preprint, under the relevant journal policy)

  • Confirming the accuracy and integrity of all parts of the manuscript

  • Addressing any post-publication concerns about errors or misconduct

Authors are expected to comply with PSR-Press editorial and ethics policies, including those concerning plagiarism, duplicate publication, and authorship criteria.

3. Figures, Images, and Artwork

3.1 Scientific Figures and Images

To protect the integrity and reproducibility of research, PSR-Press journals do not allow the use of generative AI or AI-assisted tools to create or manipulate scientific images and figures in submitted manuscripts, including but not limited to:

  • Microscopy, radiology, pathology, and other biomedical images

  • Photographs, gels, blots, or any image that represents experimental or observational data

  • Graphs or plots derived from experimental datasets, where AI is used to alter the appearance of the data beyond what standard visualization tools provide

Prohibited uses include:

  • Adding, removing, or altering features or objects in an image

  • Compositing images without clear indication and justification

  • “Enhancing” images in a way that changes or obscures original information

Permitted adjustments include:

  • Global changes to brightness, contrast, or color balance that do not hide, remove, or distort any underlying information

  • Standard cropping for clarity, provided this is transparently described when relevant

PSR-Press may use image-forensic or other tools to examine submitted figures for irregularities.

3.2 AI Tools as Part of Research Methods

The only exception to the above prohibition is when the use of AI or AI-assisted tools is integral to the research itself (for example, AI-based image reconstruction, segmentation, or analysis in biomedical imaging or computer vision).

In such cases:

  • The methods must be described in a reproducible way in the Methods section, including:

    • Name and version of the AI model or software

    • Parameters and training data used, where applicable

    • Detailed description of how the AI tool was applied

  • Authors should be prepared to provide original, unprocessed data or pre-AI images if requested by editors or reviewers.
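
For illustration only (model names, parameters, and datasets are placeholders to be replaced with the study’s actual details), such a description might read: “Image segmentation was performed using [model name, version] with [parameter settings], trained on [dataset]; all AI-generated outputs were verified against the original, unprocessed images, which are available on request.”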

3.3 Graphical Abstracts and Artistic Artwork

  • Generative AI tools must not be used to create graphical abstracts or illustrative artwork that could be mistaken for genuine research data or mislead readers about the nature or origin of the content.

  • Use of AI-generated artwork or cover designs may be considered for non-data-bearing material (e.g., journal covers, promotional images) only if:

    • Prior approval is obtained from the journal’s editorial office or publisher;

    • Authors confirm that they hold all necessary rights to use and publish the artwork;

    • Appropriate attribution or acknowledgements are provided, in line with journal practice.

4. For Reviewers

4.1 Confidentiality and AI Tools

Manuscripts sent for peer review are confidential documents. Reviewers must not upload:

  • The manuscript (in whole or in part)

  • Supplementary files

  • Their review report or any part of it

into generative AI systems or other AI tools that are not explicitly authorized and controlled by PSR-Press, as this may:

  • Breach author confidentiality and proprietary rights

  • Violate data protection or privacy regulations, especially if personal or sensitive data are involved

  • Transfer rights or data to third parties without the authors’ or publisher’s consent

The confidentiality of the peer review process extends to all materials and communications associated with the review.

4.2 Non-delegation of Peer Review

Peer review requires expert judgment, critical assessment, and ethical responsibility that cannot be delegated to AI systems.

  • Reviewers must not rely on AI tools to evaluate the scientific quality, novelty, or significance of a manuscript.

  • AI-generated summaries, critiques, or recommendations must not be used as substitutes for the reviewer’s own analysis.

  • Reviewers are personally responsible for the content, fairness, and integrity of their review reports.

If a reviewer wishes to use language support tools (e.g., grammar correction, clarity suggestions) for their own review text, they must ensure that no confidential manuscript content is shared with third-party tools.

5. Use of AI Tools by PSR-Press

PSR-Press may use AI-assisted systems to support editorial processes, for example:

  • Initial screening of manuscripts for completeness, plagiarism, or ethical issues

  • Identifying potential reviewers based on subject area and expertise

  • Assisting in workload management and workflow optimization

Any such tools will:

  • Operate under strict privacy, data protection and security standards

  • Not be used to make final editorial decisions without human oversight

  • Be regularly evaluated for bias, reliability, and compliance with relevant regulations and ethical principles

6. Policy Updates and Contact

This policy will be updated as AI technologies and community norms evolve. Authors, reviewers and editors are encouraged to:

  • Consult the latest version of this policy on the PSR-Press website before submission or review

  • Contact the relevant journal’s editorial office if they are unsure how this policy applies to a specific situation

PSR-Press is committed to promoting responsible, transparent, and human-centered use of AI tools in scholarly publishing, while preserving the integrity and trustworthiness of the scientific record.