
Modified items

All recently modified items, latest first.
RPMPackage python3-langfuse-2.53.9-1.lbn36.noarch
Langfuse Python SDK
RPMPackage python3-langflow+postgresql-1.4.0-1.lbn36.noarch
This is a metapackage bringing in postgresql extras requires for python3-langflow. It makes sure the dependencies are installed.
RPMPackage python3-langflow+nv-ingest-1.4.0-1.lbn36.noarch
This is a metapackage bringing in nv-ingest extras requires for python3-langflow. It makes sure the dependencies are installed.
RPMPackage python3-langflow+local-1.4.0-1.lbn36.noarch
This is a metapackage bringing in local extras requires for python3-langflow. It makes sure the dependencies are installed.
RPMPackage python3-langflow+deploy-1.4.0-1.lbn36.noarch
This is a metapackage bringing in deploy extras requires for python3-langflow. It makes sure the dependencies are installed.
RPMPackage python3-langflow+couchbase-1.4.0-1.lbn36.noarch
This is a metapackage bringing in couchbase extras requires for python3-langflow. It makes sure the dependencies are installed.
RPMPackage python3-langflow+clickhouse-connect-1.4.0-1.lbn36.noarch
This is a metapackage bringing in clickhouse-connect extras requires for python3-langflow. It makes sure the dependencies are installed.
RPMPackage python3-langflow+cassio-1.4.0-1.lbn36.noarch
This is a metapackage bringing in cassio extras requires for python3-langflow. It makes sure the dependencies are installed.
RPMPackage python3-langflow-1.4.0-1.lbn36.noarch
Langflow is a low-code app builder for RAG and multi-agent AI applications. It is Python-based and agnostic to any model, API, or database.
✨ Core features:
- Python-based and agnostic to models, APIs, data sources, or databases.
- Visual IDE for drag-and-drop building and testing of workflows.
- Playground to immediately test and iterate on workflows with step-by-step control.
- Multi-agent orchestration with conversation management and retrieval.
- Free cloud service to get started in minutes with no setup.
- Publish as an API or export as a Python application.
- Observability with LangSmith, LangFuse, or LangWatch integration.
- Enterprise-grade security and scalability with the free DataStax Langflow cloud service.
- Customize workflows or create flows entirely in Python.
- Ecosystem integrations as reusable components for any model, API, or database.
RPMPackage python3-langchainhub-0.1.15-1.lbn36.noarch
The LangChain Hub API client
RPMPackage python3-langchain-xai-0.2.3-1.lbn36.noarch
langchain-xai This package contains the LangChain integrations for xAI through their APIs.
Installation and Setup: install the LangChain partner package with pip install -U langchain-xai, then get your xAI API key from the xAI Dashboard and set it as an environment variable (XAI_API_KEY).
Chat Completions: this package contains the ChatXAI class, which is the recommended way to interface with xAI chat models.
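The ChatXAI usage described above can be sketched roughly as follows. The model name and message contents are illustrative assumptions, not values from this description, and the request is only attempted when XAI_API_KEY is actually set:

```python
import os

# LangChain chat models accept a list of (role, content) tuples.
messages = [
    ("system", "You are a helpful assistant."),
    ("human", "What is the capital of France?"),
]

if os.environ.get("XAI_API_KEY"):
    # Only attempt a real call when credentials are available.
    from langchain_xai import ChatXAI

    llm = ChatXAI(model="grok-2")  # model name is an illustrative assumption
    print(llm.invoke(messages).content)
```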
RPMPackage python3-langchain-unstructured-0.1.5-1.lbn36.noarch
langchain-unstructured This package contains the LangChain integration with Unstructured.
Installation: pip install -U langchain-unstructured
And you should configure credentials by setting the following environment variable: export UNSTRUCTURED_API_KEY="your-api-key"
Loaders: partition and load files either using the unstructured-client SDK and the Unstructured API, or locally using the unstructured library.
API: to partition via the Unstructured API, pip install unstructured-client, set partition_via_api=True, and define api_key. If you are running the Unstructured API locally, you can change the API URL by defining url when you initialize the loader. The hosted Unstructured API requires an API key. See the links below to learn more about our API offerings and get an API key.
Local: by default the file loader uses the Unstructured partition function and will automatically detect the file type. In addition to document-specific partition parameters, Unstructured has a rich set of "chu…
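The API-based loading path described above might look like this sketch. The file path is a hypothetical placeholder, and the loader is only constructed when UNSTRUCTURED_API_KEY is set:

```python
import os

file_path = "example.pdf"  # hypothetical input file

if os.environ.get("UNSTRUCTURED_API_KEY"):
    from langchain_unstructured import UnstructuredLoader

    # partition_via_api=True sends the file to the hosted Unstructured API
    # instead of partitioning locally with the unstructured library.
    loader = UnstructuredLoader(
        file_path=file_path,
        partition_via_api=True,
        api_key=os.environ["UNSTRUCTURED_API_KEY"],
    )
    docs = loader.load()
    print(len(docs))
```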
RPMPackage python3-langchain-together-0.3.0-1.lbn36.noarch
langchain-together This package contains the LangChain integrations for Together AI through their APIs.
Installation and Setup: install the LangChain partner package with pip install -U langchain-together, then get your Together AI API key from the Together Dashboard and set it as an environment variable (TOGETHER_API_KEY).
Chat Completions: this package contains the ChatTogether class, which is the recommended way to interface with Together AI chat models.
Embeddings: togethercomputer/m2-bert-80M-8k-retrieval is used as the default model for embeddings.
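A minimal sketch of the ChatTogether and embeddings usage described above; the chat model name is an illustrative assumption (only the default embedding model is named in the description), and calls are only made when TOGETHER_API_KEY is set:

```python
import os

prompt = "Summarize LangChain in one sentence."

if os.environ.get("TOGETHER_API_KEY"):
    from langchain_together import ChatTogether, TogetherEmbeddings

    # Chat model name below is an illustrative assumption.
    llm = ChatTogether(model="meta-llama/Llama-3-8b-chat-hf")
    print(llm.invoke(prompt).content)

    # Default embedding model named in the description above.
    emb = TogetherEmbeddings(model="togethercomputer/m2-bert-80M-8k-retrieval")
    print(len(emb.embed_query(prompt)))
```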
RPMPackage python3-langchain-text-splitters-0.3.8-1.lbn36.noarch
🦜✂️ LangChain Text Splitters
Quick Install: pip install langchain-text-splitters
What is it? LangChain Text Splitters contains utilities for splitting a wide variety of text documents into chunks. For full documentation see the API reference and the Text Splitters module in the main docs.
📕 Releases & Versioning: langchain-text-splitters is currently on version 0.3.x. Minor version increases will occur for breaking changes to any public interfaces NOT marked beta. Patch version increases will occur for bug fixes, new features, any changes to private interfaces, and any changes to beta features.
💁 Contributing: as an open-source project in a rapidly developing field, we are extremely open to contributions, whether in the form of a new feature, improved infrastructure, or better documentation. For detailed information on how to contribute, see the Contributing Guide.
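To illustrate the idea of splitting text into chunks, here is a toy fixed-size splitter with overlap in plain Python. This is a sketch of the concept only, not the library's implementation, which offers far richer, structure-aware strategies:

```python
def split_text(text: str, chunk_size: int = 20, overlap: int = 5) -> list[str]:
    """Naive fixed-size splitter with overlap; a toy stand-in for the library."""
    if chunk_size <= overlap:
        raise ValueError("chunk_size must be larger than overlap")
    chunks = []
    step = chunk_size - overlap  # each chunk starts `step` characters after the last
    for start in range(0, len(text), step):
        chunk = text[start:start + chunk_size]
        if chunk:
            chunks.append(chunk)
    return chunks

# 50 characters, 20-char chunks, 5-char overlap -> starts at 0, 15, 30, 45.
chunks = split_text("a" * 50, chunk_size=20, overlap=5)
```

Overlap preserves context across chunk boundaries, which tends to help retrieval quality when chunks are embedded independently.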
RPMPackage python3-langchain-tests-0.3.19-1.lbn36.noarch
langchain-tests This is a testing library for LangChain integrations. It contains the base classes for a standard set of tests.
Installation: we encourage pinning to a specific version in order to avoid breaking your CI when we publish new tests, and upgrading to the latest version periodically to make sure you have the latest tests. Not pinning your version will ensure you always have the latest tests, but it may also break your CI if we introduce tests that your integration doesn't pass.
Pip:
```bash
pip install -U langchain-tests
```
Poetry:
```bash
poetry add langchain-tests
```
Usage: to add standard tests to an integration package's component (e.g. a ChatModel), you need to create a unit test class that inherits from ChatModelUnitTests and an integration test class that inherits from ChatModelIntegrationTests.
tests/unit_tests/test_standard.py: """Standard LangChain interface tests""" from typing import Type import pytest from langchain_core.language_models import BaseChat…
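A unit test class following the pattern above might look like this sketch, using langchain_core's GenericFakeChatModel as a stand-in for a real integration's chat model; the class is only defined when the testing library is importable:

```python
import importlib.util

# Conventional location named in the description above.
target_file = "tests/unit_tests/test_standard.py"

# Only define the test class when the testing library is importable.
if importlib.util.find_spec("langchain_tests") and importlib.util.find_spec("langchain_core"):
    from langchain_tests.unit_tests import ChatModelUnitTests
    from langchain_core.language_models import GenericFakeChatModel  # stand-in model

    class TestGenericFakeChatModelStandard(ChatModelUnitTests):
        @property
        def chat_model_class(self):
            # A real integration returns its own ChatModel class here.
            return GenericFakeChatModel
```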
RPMPackage python3-langchain-sambanova-0.1.0-1.lbn36.noarch
langchain-sambanova
RPMPackage python3-langchain-pinecone-0.2.2-1.lbn36.noarch
langchain-pinecone This package contains the LangChain integration with Pinecone.
Installation: pip install -U langchain-pinecone
And you should configure credentials by setting the following environment variables: PINECONE_API_KEY, PINECONE_INDEX_NAME.
Usage: the PineconeVectorStore class exposes the connection to the Pinecone vector store.
from langchain_pinecone import PineconeVectorStore
embeddings = ...  # use a LangChain Embeddings class
vectorstore = PineconeVectorStore(embeddings=embeddings)
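Filling in the snippet above with a concrete (but assumed) Embeddings implementation and a similarity search might look like this sketch; the store is only constructed when both environment variables named above are set:

```python
import os

query = "What is LangChain?"

if os.environ.get("PINECONE_API_KEY") and os.environ.get("PINECONE_INDEX_NAME"):
    from langchain_pinecone import PineconeVectorStore
    from langchain_openai import OpenAIEmbeddings  # illustrative; any Embeddings class works

    # Keyword mirrors the snippet in the description above.
    vectorstore = PineconeVectorStore(
        embeddings=OpenAIEmbeddings(),
        index_name=os.environ["PINECONE_INDEX_NAME"],
    )
    for doc in vectorstore.similarity_search(query, k=3):
        print(doc.page_content)
```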
RPMPackage python3-langchain-perplexity-0.1.1-1.lbn36.noarch
langchain-perplexity
RPMPackage python3-langchain-openai-0.3.16-1.lbn36.noarch
langchain-openai This package contains the LangChain integrations for OpenAI through their openai SDK.
Installation and Setup: install the LangChain partner package with pip install langchain-openai, then get an OpenAI API key and set it as an environment variable (OPENAI_API_KEY).
LLM See a usage example: from langchain_openai import OpenAI
If you are using a model hosted on Azure, you should use a different wrapper for that: from langchain_openai import AzureOpenAI
For a more detailed walkthrough of the Azure wrapper, see here.
Chat model: from langchain_openai import ChatOpenAI and from langchain_openai import AzureChatOpenAI
Text Embedding Model See a usage example: from langchain_openai import OpenAIEmbeddings and from langchain_openai import AzureOpenAIEmbeddings
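Putting the chat-model import above to use might look like this sketch; the model name is an illustrative assumption, and the call is only made when OPENAI_API_KEY is set:

```python
import os

question = "What is the capital of France?"

if os.environ.get("OPENAI_API_KEY"):
    from langchain_openai import ChatOpenAI

    llm = ChatOpenAI(model="gpt-4o-mini")  # model name is an illustrative assumption
    print(llm.invoke(question).content)
```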
RPMPackage python3-langchain-ollama-0.3.2-1.lbn36.noarch
langchain-ollama This package contains the LangChain integration with Ollama.
Installation: pip install -U langchain-ollama. You will also need to run the Ollama server locally; you can download it here.
Chat Models: the ChatOllama class exposes chat models from Ollama.
from langchain_ollama import ChatOllama
llm = ChatOllama(model="llama3-groq-tool-use")
llm.invoke("Sing a ballad of LangChain.")
Embeddings: the OllamaEmbeddings class exposes embeddings from Ollama.
from langchain_ollama import OllamaEmbeddings
embeddings = OllamaEmbeddings(model="llama3")
embeddings.embed_query("What is the meaning of life?")
LLMs: the OllamaLLM class exposes LLMs from Ollama.
from langchain_ollama import OllamaLLM
llm = OllamaLLM(model="llama3")
llm.invoke("The meaning of life is")