What is Lleverage?

Welcome to the Lleverage Docs. This site includes an introduction to Lleverage concepts, resources for using the API and SDK, and best practices for building and deploying AI workflows.

What is Lleverage?

Lleverage is an AI development platform designed to simplify building and deploying AI features. It helps product and engineering teams collaborate to create production-ready AI functionality with ease. By using Lleverage, you can design, test, and manage complex AI workflows that integrate seamlessly into your applications.

Our platform supports a wide range of AI tasks, from writing production-ready prompts to integrating your own guardrails and business logic into a workflow. Whether you're working with generative models or handling complex data operations, Lleverage provides the tools to streamline your AI development process.

Lleverage is simple and straightforward, but it's good to be aware of the following constructs so you can leverage the full feature set with ease.

Key Concepts

  • Organisations are the top-level containers that house all your Lleverage projects and can hold their own top-level connections.

  • Projects are the containers where all your work happens. They live within organisations and help you manage your workflows, prompts, vector stores and more. You can invite users to collaborate, and each project can have its own connections and secrets.

  • Workflows allow you to build and collaborate on powerful AI workflows that chain business logic, data, APIs and prompts to create full-fledged AI functionality. Once published, every workflow becomes an API endpoint that you can call from within your codebase (see the sketch after this list).

  • Prompts are used to construct and optimise calls to generative models. You have standard access to the industry's leading models and are supported by AI helpers to craft a great prompt and select the right model for the job.

  • Vector Stores allow you to upload documents and data and bring them into vector space. Vector stores can be called from within workflows to perform similarity search.

  • Connections give you access to data sources and third-party services for building AI workflows. Connection types include:

    • Model providers to call generative services like LLMs, text-to-image or text-to-speech models.

    • Databases to fetch data from and store data in your own relational or vector databases.

  • Insights allow you to track how your workflows and prompts run in production and to explore ways to optimise on cost, latency or quality. This feature is currently in private alpha.

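To give a feel for how a published workflow is used, here is a minimal sketch of calling one over HTTP from TypeScript. The endpoint URL, authentication header and request body shown here are illustrative assumptions, not the documented API; the actual endpoint and credentials come from your workflow's publish settings.

```ts
// Minimal sketch of calling a published workflow endpoint over HTTP.
// The URL, auth header and payload shape are placeholders (assumptions) —
// replace them with the values shown when you publish your workflow.

const endpoint = "https://api.lleverage.ai/v1/workflows/<workflow-id>/run"; // hypothetical URL
const apiKey = process.env.LLEVERAGE_API_KEY ?? "";

async function runWorkflow(input: Record<string, unknown>) {
  const response = await fetch(endpoint, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`, // auth scheme is an assumption
    },
    body: JSON.stringify(input),
  });
  if (!response.ok) {
    throw new Error(`Workflow call failed with status ${response.status}`);
  }
  return response.json();
}

// Example usage: pass whatever inputs your workflow expects.
runWorkflow({ question: "What is our refund policy?" })
  .then((result) => console.log(result))
  .catch((err) => console.error(err));
```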
If you're getting started, your organisation will contain a Default Project and have LLM keys provided to you by Lleverage. The next step is to write your first Prompts or build your first Workflows.

