
AI

AI actions let you integrate artificial intelligence into your workflows. At their core is the Prompt action, which uses a Large Language Model (LLM) to understand text, generate content, analyze data, and help with a wide range of tasks.

How it works

When you add a Prompt action to your workflow, you're creating an interaction with an AI model. You provide instructions (called prompts) and any necessary data, and the model generates appropriate responses. These responses can be anything from simple text to structured data, depending on your needs.

Every Prompt action consists of three main components, shown together in the sketch after this list:

  • The prompt that tells the model what to do

  • The model configuration that controls how it behaves

  • The output settings that determine what you get back
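To make the three components concrete, here is a hypothetical sketch of how they fit together. The field names (prompt, modelConfig, output) and the {{...}} variable syntax are illustrative assumptions, not Lleverage's actual API; in practice you configure each of these through the workflow builder.

```typescript
// Hypothetical sketch of a Prompt action's three components.
// Field names and the {{...}} variable syntax are illustrative only;
// in Lleverage these are configured through the workflow builder UI.
interface PromptActionSketch {
  prompt: {
    instructions: string;              // what the model should do
    inputs: Record<string, string>;    // workflow variables referenced in the prompt
  };
  modelConfig: {
    model: string;                     // which model runs the prompt
    temperature: number;               // deterministic (low) vs. creative (high) output
    maxTokens: number;                 // upper bound on response length
  };
  output: {
    format: "text" | "json";           // plain text or structured data
    schema?: object;                   // optional shape for structured output
  };
}

// Example values for a ticket-summarisation step.
const summariseTicket: PromptActionSketch = {
  prompt: {
    instructions: "Summarise the support ticket in two sentences.",
    inputs: { ticketBody: "{{ticket.body}}" },
  },
  modelConfig: { model: "gpt-4o", temperature: 0.2, maxTokens: 200 },
  output: { format: "text" },
};
```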

Key capabilities

The Prompt action can handle various tasks:

  • Generate text and content

  • Analyze documents and data

  • Extract information

  • Answer questions

  • Translate languages

  • Help with decision-making

You can improve results by engineering your prompts, fine-tuning model settings, structuring the output format (an example follows this paragraph), and comparing different prompt and model configurations.
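Structuring the output format often means asking for JSON that matches a schema. The prompt and JSON Schema below are a hypothetical illustration of that technique, not a documented Lleverage feature; the platform's actual output settings live on the action itself.

```typescript
// Hypothetical example of structured output: the prompt asks for JSON
// and the schema describes the expected shape. Both are illustrative,
// not part of Lleverage's documented configuration.
const extractionPrompt = `
Extract the following fields from the invoice text and return JSON only:
- vendor (string)
- invoiceDate (ISO 8601 date)
- totalAmount (number)
`;

const invoiceSchema = {
  type: "object",
  properties: {
    vendor: { type: "string" },
    invoiceDate: { type: "string", format: "date" },
    totalAmount: { type: "number" },
  },
  required: ["vendor", "invoiceDate", "totalAmount"],
};
```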

Getting started

To use AI in your workflow:

  1. Add a Prompt action where you need AI capabilities

  2. Choose a model appropriate for your task

  3. Write or generate a prompt that describes what you need

  4. Configure how you want the model to respond

  5. Test and refine until you get the desired results (a minimal test loop is sketched after these steps)
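Step 5 usually comes down to running a few known inputs through the action and comparing what comes back with what you expect. The loop below is a minimal sketch of that idea; runPromptAction is a placeholder for executing the action (for example from the test panel), not a real Lleverage function.

```typescript
// Minimal test-and-refine sketch for step 5. `runPromptAction` is a
// placeholder for running the Prompt action with a given input; it is
// not part of any real Lleverage API.
async function refine(runPromptAction: (input: string) => Promise<string>) {
  const cases = [
    { input: "My card was charged twice this month.", expected: "billing" },
    { input: "The export button throws an error.", expected: "technical" },
  ];
  for (const c of cases) {
    const got = (await runPromptAction(c.input)).trim();
    console.log(`${c.input} -> ${got} (${got === c.expected ? "OK" : "adjust prompt or settings"})`);
  }
}
```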

Continue reading:

Using the Prompt action →
Explore prompt engineering →
Understand model configuration →