
Modified items

All recently modified items, latest first.
RPMPackage python3-openapi-spec-validator-0.7.1-1.lbn36.noarch
OpenAPI Spec Validator, a Python library that validates OpenAPI specifications against the OpenAPI 2.0 (Swagger), 3.0, and 3.1 specifications.
RPMPackage python3-openapi-schema-validator-0.6.3-1.lbn36.noarch
About Openapi-schema-validator is a Python library that validates a schema against: OpenAPI Schema Specification v3.0, which is an extended subset of the JSON Schema Specification Wright Draft 00; OpenAPI Schema Specification v3.1, which is an extended superset of the JSON Schema Specification Draft 2020-12. Installation Recommended way (via pip): $ pip install openapi-schema-validator Alternatively you can download the code and install from the repository: $ pip install -e git+https://github.com/p1c2u/openapi-schema-validator.git#egg=openapi_schema_validator Usage By default, the latest OpenAPI schema syntax is expected. To validate an OpenAPI v3.1 schema, import validate from openapi_schema_validator and call it with the instance and the schema.
RPMPackage python3-openapi-core-0.18.2-1.lbn36.noarch
About Openapi-core is a Python library that adds client-side and server-side support for the OpenAPI v3.0 and OpenAPI v3.1 specifications. Key features: validation and unmarshalling of request and response data (including webhooks); integration with popular libraries (Requests, Werkzeug) and frameworks (Django, Falcon, Flask, Starlette); customization with media type deserializers and format unmarshallers; security data providers (API keys, Cookie, Basic and Bearer HTTP authentication). Documentation Check the documentation to see more details about the features. All documentation is in the "docs" directory and online at openapi-core.readthedocs.io Installation Recommended way (via pip): pip install openapi-core Alternatively you can download the code and install from the repository: pip install -e git+https://github.com/python-openapi/openapi-core.git#egg=openapi_core First steps First, create your OpenAPI object from your specification file.
RPMPackage python3-openapi-client-0.21.2-3.lbn36.noarch
openapi-python-client Generate modern Python clients from OpenAPI 3.0 and 3.1 documents. This generator does not support OpenAPI 2.x (formerly known as Swagger). If you need to use an older document, try upgrading it to version 3 first with one of the many available converters. This project is still in development and does not support all OpenAPI features. Why This? This tool focuses on creating the best developer experience for Python developers by: using all the latest and greatest Python features like type annotations and dataclasses; having documentation and usage instructions specific to this one generator; being written in Python with Jinja2 templates, making it easier to improve and extend for Python developers. It's also much easier to install and use if you already have Python. Installation It is recommended to install with pipx so you don't conflict with any other packages you might have: pipx install openapi-python-client --include-deps. Note that the --include-deps option makes ruff available in your environment.
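A typical install-and-generate session might look like the following sketch; the local spec path is a placeholder, and only the install command comes from the description above:

```shell
# Install the generator in an isolated environment (pipx keeps its
# dependencies from conflicting with other installed packages).
pipx install openapi-python-client --include-deps

# Generate a client package from an OpenAPI 3.x document
# (use --path for a local file, or --url for a remote one).
openapi-python-client generate --path ./openapi.json
```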
RPMPackage python3-openai-wandb-1.6.1-1.lbn36.noarch
Install support for Weights and Biases
RPMPackage python3-openai-embeddings-1.6.1-1.lbn36.noarch
Install dependencies for embedding utilities.
RPMPackage python3-openai-datalib-1.6.1-1.lbn36.noarch
Data libraries like numpy and pandas are not installed by default due to their size. They’re needed for some functionality of this library, but generally not for talking to the API.
RPMPackage python3-openai+voice_helpers-1.77.0-1.lbn36.noarch
This is a metapackage bringing in voice_helpers extras requires for python3-openai. It makes sure the dependencies are installed.
RPMPackage python3-openai+realtime-1.77.0-1.lbn36.noarch
This is a metapackage bringing in realtime extras requires for python3-openai. It makes sure the dependencies are installed.
RPMPackage python3-ollama-0.4.7-1.lbn36.noarch
Ollama Python Library The Ollama Python library provides the easiest way to integrate Python 3.8+ projects with Ollama. Prerequisites Ollama should be installed and running. Pull a model to use with the library: ollama pull <model>, e.g. ollama pull llama3.2. See Ollama.com for more information on the available models. Usage from ollama import chat from ollama import ChatResponse response: ChatResponse = chat(model='llama3.2', messages=[ { 'role': 'user', 'content': 'Why is the sky blue?', }, ]) print(response['message']['content']) # or access fields directly from the response object: print(response.message.content)
RPMPackage python3-notebook-shim-0.2.3-1.lbn36.noarch
Notebook Shim This project provides a way for JupyterLab and other frontends to switch to Jupyter Server for their Python web application backend. Basic Usage Install from PyPI: pip install notebook_shim This will automatically enable the extension in Jupyter Server. Usage This project also includes an API for shimming traits that moved from NotebookApp into ServerApp in Jupyter Server. This can be used by applications that subclassed NotebookApp to leverage the Python server backend of Jupyter Notebooks. Such extensions should now switch to the ExtensionApp API in Jupyter Server and add NotebookConfigShimMixin to their inheritance list to properly handle moved traits. For example, an application class that previously looked like: from notebook.notebookapp import NotebookApp class MyApplication(NotebookApp): should switch to look something like: from jupyter_server.extension.application import ExtensionApp from notebook_shim.shim import NotebookConfigShimMixin class MyApplication(NotebookConfigShimMixin, ExtensionApp):
RPMPackage python3-needle-python-0.4.0-1.lbn36.noarch
Needle Python Library This Python library provides convenient access to the Needle API. There are various methods and data types which, we believe, will help you explore the Needle API quickly. Some functionality may be available in the REST API earlier than in this Python library. In any case, we recommend taking a look at the complete documentation. Thank you for flying with us. 🚀 Installation This library requires Python >3.8 and pip. You don't need the sources unless you want to modify the library. Install with: pip install needle-python Usage ⚡️ To get started, generate an API key for your account in the developer settings menu at Needle. Note that your key will be valid until you revoke it. Set the following env variable before you run your code: export NEEDLE_API_KEY=<your-api-key> NeedleClient reads the API key from the environment by default. If you would like to override this behaviour, you can pass it in as a parameter.
RPMPackage python3-needle-0.5.0-2.lbn36.noarch
Needle is a tool for testing your CSS and visuals with Selenium and nose. It checks that visuals (CSS/fonts/images/SVG/etc.) render correctly by taking screenshots of portions of a website and comparing them against known good screenshots. It also provides tools for testing calculated CSS values and the position of HTML elements.
RPMPackage python3-nbsphinx-0.8.7-2.fc36.noarch
nbsphinx is a Sphinx extension that provides a source parser for *.ipynb files. Custom Sphinx directives are used to show Jupyter Notebook code cells (and of course their results) in both HTML and LaTeX output. Unevaluated notebooks, i.e. notebooks without stored output cells, will be automatically executed during the Sphinx build process.
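Enabling the extension described above is a one-line change in a Sphinx project's conf.py; a minimal sketch (the exclude pattern is a common companion setting, not a requirement):

```python
# conf.py (Sphinx configuration fragment)

# Register nbsphinx so Sphinx can parse *.ipynb source files.
extensions = [
    "nbsphinx",
]

# Keep build output and notebook checkpoint copies out of the docs build.
exclude_patterns = ["_build", "**.ipynb_checkpoints"]
```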
RPMPackage python3-mistralclient-4.5.0-1.lbn36.noarch
Python client for Mistral REST API. Includes python library for Mistral API and Command Line Interface (CLI) library.
RPMPackage python3-mistralai-1.7.0-1.lbn36.noarch
Mistral Python Client Mistral AI API: Our Chat Completion and Embeddings APIs specification.
RPMPackage python3-litellm+proxy-1.69.0-1.lbn36.noarch
This is a metapackage bringing in proxy extras requires for python3-litellm. It makes sure the dependencies are installed.
RPMPackage python3-litellm-1.69.0-1.lbn36.noarch
🚅 LiteLLM Call all LLM APIs using the OpenAI format [Bedrock, Huggingface, VertexAI, TogetherAI, Azure, OpenAI, Groq etc.] LiteLLM manages: translating inputs to the provider's completion, embedding, and image_generation endpoints; consistent output, so text responses will always be available at ['choices'][0]['message']['content']; retry/fallback logic across multiple deployments (e.g. Azure/OpenAI) via the Router; setting budgets and rate limits per project, API key, and model. LiteLLM Proxy Server (LLM Gateway)
RPMPackage python3-langwatch-0.1.16-1.lbn36.noarch
LangWatch Python SDK Go to https://docs.langwatch.ai to get started. Contributing After changing code, to test that all integrations are working, run the examples integration test manually (you will need all env vars to be set up): poetry run pytest tests/test_examples.py -p no:warnings -s -x Or, to test only a specific example, run: poetry run pytest tests/test_examples.py -p no:warnings -s -x -k <example_name>
RPMPackage python3-language-server-0.36.2-6.fc35.noarch
A Python implementation of the Language Server Protocol.