Add a new endpoint

Adding a new endpoint to SchemaLink’s API is as simple as adding a new function to the main.py file. HTTP methods, path parameters, and query parameters are automatically inferred from the function signature. For more details on how to add a new endpoint, refer to this part of the FastAPI documentation: create a path operation.

Change URL for an endpoint

To change the URL for an endpoint, change the function signature of the target endpoint in the main.py file.

This will most likely be a breaking change for consumers of the API! Make sure all of SchemaLink’s webapp instances continue to work by updating their configuration in the docker-compose file.
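The service and variable names below are hypothetical (check the actual compose file); this only illustrates where a webapp instance’s API URL would be updated:

```yaml
services:
  webapp:
    image: ghcr.io/anacletolab/schemalink-webapp:latest
    environment:
      # Hypothetical variable name: point the webapp at the new endpoint URL
      - API_BASE_URL=https://api.example.org/api
```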

Update the API docs

The API documentation is automatically generated by FastAPI. For more details on how to configure the API documentation, refer to this part of the FastAPI documentation: OpenAPI.

Work with the linkml library

The API proxies access to some functionalities of the linkml Python library. This includes validating LinkML schemas and generating the equivalent Pydantic version for usage with SPIRES. The library is part of the requirements of the API project. The library offers a number of modules, which implement linters, validators, generators, and more. All of these modules can be imported in the main.py file and used to implement the logic of the endpoints. Extensive details about this library can be found in its documentation.

Work with the OpenAI Python SDK

The API proxies access to some functionalities of the OpenAI Python SDK. This includes invoking the base API, as well as the assistant API. The openai library is part of the requirements of the API project. The first step is always creating an OpenAI client:

```python
client = OpenAI(api_key=OPENAI_API_KEY)
```

The OPENAI_API_KEY should always be stored securely and set as an environment variable when running the API.

Never hardcode the API key in public repositories, nor expose it in client-side code! For reference, we store all of our sensitive information, such as the OPENAI_API_KEY, as ansible-vault variables in our playbook.

Work with OpenAI base API

The client.chat.completions module takes care of interacting with OpenAI’s base API. For extensive details on how to use this module, refer to this part of the OpenAI documentation: Completions. Interaction with the base API is implemented in the /api/openai/ask/ endpoint.

Work with OpenAI assistant API

The client.beta.assistants and client.beta.threads modules take care of interacting with OpenAI’s assistant API.

The SchemaLink assistant is manually managed by AnacletoLAB. Because of this, for the time being, there is no need to use the client.beta.assistants module in the API project.

For extensive details on how to use these modules, refer to this part of the OpenAI documentation: Assistants API quickstart. Interaction with the assistant API is implemented in the /api/openai/generate/ endpoint.

Deploy a new version of the API

Deploying a new version is automated by means of the docker GitHub action. This action runs on any new git tag. To deploy a new version, simply create a new tag in your local repository and push it to the remote repository.

```shell
git tag -a 1.0.0 -m "1.0.0"
git push --tags
```

When creating a new tag, consider following semantic versioning.

This action will create a new docker image and push it to AnacletoLAB’s GitHub container registry, from where you can then pull the image and run it anywhere. For this to work, the CR_PAT secret must be set in the repository settings, with a personal access token that has access to the GitHub container registry. For more details, see the GitHub docs.
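The shape of such a tag-triggered workflow might look like the following sketch (file name, action versions, and image name are assumptions; check the repository’s actual workflow):

```yaml
# .github/workflows/docker.yml (sketch; the actual workflow may differ)
on:
  push:
    tags: ["*"]

jobs:
  docker:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Log in to GitHub container registry
        uses: docker/login-action@v3
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.CR_PAT }}
      - name: Build and push
        uses: docker/build-push-action@v6
        with:
          push: true
          tags: ghcr.io/anacletolab/schemalink-api:${{ github.ref_name }}
```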

The personal access token is tied to a GitHub account. If that GitHub account belongs to a contributor, make sure to change this token when the contributor leaves the project.