Convert natural language to structured data using LLM intelligence
The Parameter Extractor node converts unstructured text into structured data using LLM intelligence. It bridges the gap between natural language input and the structured parameters that tools, APIs, and other workflow nodes require.
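For instance, a restaurant-booking request might be turned into typed parameters like the ones below. The input text, parameter names, and values are purely illustrative and not tied to any specific model or template:

```python
# Illustrative only: the kind of transformation the Parameter Extractor performs.
# The input text and the extracted parameter names/values are hypothetical.
input_text = "Hi, I'd like to book a table for 4 people this Friday at 7pm under the name Chen."

extracted_parameters = {
    "party_size": 4,             # number
    "date": "Friday",            # string
    "time": "19:00",             # string
    "reservation_name": "Chen",  # string
}
```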
Select the Input Variable containing the text you want to extract parameters from. This typically comes from user input, LLM responses, or other workflow nodes.

Choose a Model with strong structured output capabilities. The Parameter Extractor relies on the LLM’s ability to understand context and generate structured JSON responses.
Write clear instructions describing what information to extract and how to format it. Providing examples in your instructions improves extraction accuracy and consistency for complex parameters.
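As a rough sketch of such an instruction, here is a hypothetical order-processing example embedded as a Python string; the field names, rules, and worked example are assumptions, not part of the product:

```python
# Hypothetical extraction instruction for an order-processing workflow.
# Including a worked example in the instruction tends to improve consistency.
EXTRACTION_INSTRUCTION = """\
Extract the product name, quantity, and shipping city from the user's message.
- quantity must be an integer; default to 1 if not mentioned.
- shipping_city must be the city name only, without country or postal code.

Example:
Input: "Please send two of the ergonomic keyboards to my office in Berlin."
Output: {"product_name": "ergonomic keyboard", "quantity": 2, "shipping_city": "Berlin"}
"""
```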
Choose between two extraction approaches based on your model’s capabilities (a sketch of both follows below):

Function Call/Tool Call uses the model’s structured output features for reliable parameter extraction with strong type compliance.

Prompt-based relies on pure prompting, for models that may not support function calling or when prompt-based extraction performs better.
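The sketch below contrasts the two approaches for a hypothetical weather-lookup extraction. The schema, prompt wording, and sample reply are illustrative assumptions rather than the node’s actual internals:

```python
import json

# Approach 1: Function Call / Tool Call.
# The parameter definitions are handed to the model as a tool schema, and the
# model replies with structured arguments that already respect the declared types.
tool_schema = {
    "name": "extract_weather_query",  # hypothetical tool name
    "description": "Extract weather query parameters from the user's message.",
    "parameters": {
        "type": "object",
        "properties": {
            "city": {"type": "string", "description": "City to look up"},
            "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
        },
        "required": ["city"],
    },
}

# Approach 2: Prompt-based.
# The same definitions are rendered into the prompt, and the model is asked to
# answer with raw JSON that is then parsed and validated.
prompt = (
    "Extract the following fields from the message and reply with JSON only:\n"
    "- city (string, required)\n"
    "- unit (one of: celsius, fahrenheit)\n\n"
    'Message: "What is the weather like in Osaka in celsius?"'
)

# A plausible model reply; either approach ultimately yields the same parsed result.
model_reply = '{"city": "Osaka", "unit": "celsius"}'
print(json.loads(model_reply))
```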
Enable memory to include conversation history when extracting parameters. This helps the LLM understand context in interactive dialogues and improves extraction accuracy for conversational workflows.
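As a rough illustration of why history matters, consider a hypothetical follow-up message that only makes sense when the earlier turns are included:

```python
# Hypothetical conversation; with memory enabled, earlier turns are included so
# the model can resolve references like "the same city" when extracting parameters.
conversation = [
    {"role": "user", "content": "Book me a hotel in Lisbon for two nights."},
    {"role": "assistant", "content": "Done. Anything else?"},
    {"role": "user", "content": "Also find a rental car in the same city."},
]

# Without the first turn, "the same city" cannot be resolved; with memory, the
# extractor can fill city="Lisbon" for the rental-car request.
```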
The node provides both extracted parameters and built-in status variables:

Extracted Parameters appear as individual variables matching your parameter definitions, ready for use in downstream nodes.

Built-in Variables include status information (a usage sketch follows the list below):
__is_success - Extraction success status (1 for success, 0 for failure)
__reason - Error description when extraction fails
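A minimal sketch of how a downstream condition could branch on these variables, assuming the node’s output arrives as an ordinary key-value mapping; the surrounding logic is hypothetical:

```python
# Hypothetical output of the Parameter Extractor node after a run.
node_output = {
    "city": "Osaka",
    "unit": "celsius",
    "__is_success": 1,   # 1 on success, 0 on failure
    "__reason": "",      # error description when extraction fails
}

# A downstream condition/branch could route on the status variable.
if node_output["__is_success"] == 1:
    extracted = {k: v for k, v in node_output.items() if not k.startswith("__")}
    print("Proceed with:", extracted)
else:
    print("Extraction failed:", node_output["__reason"])
```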