AI4EOSC Platform launches beta LLM4EOSC API service

During the EOSC Symposium in Berlin, AI4EOSC announced the deployment of the beta LLM4EOSC (Large Language Models for the EOSC) API service. This innovative service is now available to a limited number of users in the European Open Science Cloud (EOSC) community for evaluation and feedback.

The beta LLM4EOSC API service offers powerful capabilities for natural language processing and understanding, enabling researchers and scientists to enhance their workflows and projects. By leveraging advanced AI technologies, users can perform tasks such as text generation, summarization, question answering, and more, all within the context of the EOSC.

Open call for preview access
To facilitate a thorough evaluation of the LLM4EOSC API service within the EOSC environment, we are launching an open call for preview access. Interested users from the EOSC community are invited to request access on a limited basis. This opportunity will allow users to explore the potential of LLM APIs in their specific scientific domains and provide valuable feedback that will help shape the future development of the service.

Key features of the beta LLM4EOSC API service
Advanced natural language processing: Utilize state-of-the-art LLM capabilities for text understanding and generation.
User-friendly interface: Easy-to-use API interface for quick integration into existing workflows.
Fine-grained access via API tokens: Secure and controlled access to the service through API tokens, allowing users to manage and monitor usage effectively.
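
As an illustration of the token-based access model described above, the minimal sketch below checks a personal API token against an OpenAI-compatible endpoint. The base URL and token value are placeholders, not the actual service details.

```python
# Minimal sketch: verifying API-token access against an OpenAI-compatible endpoint.
# The base URL and token below are hypothetical placeholders.
import requests

API_BASE = "https://llm.example-eosc.eu/v1"   # placeholder endpoint
API_TOKEN = "your-personal-api-token"         # personal token issued to each user

# OpenAI-compatible servers expose a /models listing; a successful response
# confirms that the token is accepted and shows which models are available.
response = requests.get(
    f"{API_BASE}/models",
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    timeout=30,
)
response.raise_for_status()
for model in response.json().get("data", []):
    print(model["id"])
```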

What can you do with the LLM4EOSC API service?
The current offering is an OpenAI-compatible API that gives access to state-of-the-art open LLMs for fast, lightweight tasks. Secured via personal API tokens, the service can be used to build AI-native applications such as chatbots, assistants, or knowledge retrieval over a given set of documents or documentation.
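
For example, the sketch below sends a chat-style request through an OpenAI-compatible Python client, the kind of lightweight task an assistant or chatbot built on the service would perform. The endpoint URL, token, and model name are illustrative placeholders; in practice you would use a model listed by the service itself.

```python
# Minimal sketch of a chat-style request through an OpenAI-compatible API.
# Endpoint, token, and model name are illustrative placeholders.
from openai import OpenAI

client = OpenAI(
    base_url="https://llm.example-eosc.eu/v1",  # placeholder service URL
    api_key="your-personal-api-token",          # personal API token
)

# Ask an open model to summarise a short piece of text, a typical
# fast, lightweight task for the service.
completion = client.chat.completions.create(
    model="mistral-7b-instruct",  # placeholder; pick a model the service lists
    messages=[
        {"role": "system", "content": "You are a concise research assistant."},
        {"role": "user", "content": "Summarise: EOSC federates European research data and services."},
    ],
    max_tokens=128,
)
print(completion.choices[0].message.content)
```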

Researchers and scientists interested in previewing the beta LLM API service are encouraged to submit their requests via the online form. The open call will remain active until 15 November or until all preview slots are filled.