Overview
The Coder Artifact in Gaife allows you to create custom code functions that process inputs and generate structured outputs. These artifacts are used in knowledge agents to handle the computational logic of your automation workflows.
Interface Layout
The Coder Artifact interface consists of:
- Header Section: Displays the artifact name with an edit option
- Action Buttons: Update and Publish buttons
- Code Generation Assistant: AI-powered assistant to help generate code
- Generated Code: Python editor with syntax highlighting
Creating a Coder Artifact
Getting Started
When creating a new Coder Artifact, you'll see the following interface elements:
- Code Generation Assistant:
  - Powered by GAIFE AI
  - Provides a conversational interface to describe what you want to build
  - Options to define input parameters, expected output, and add rules
- Supporting Files Section:
  - Rules (.txt, .json, .yaml)
  - Sample data (.csv, .json)
- Input & Output Configuration:
  - Define the structure and format of your input parameters
  - Specify the expected output format
Writing Coder Functions
Function Structure
Every Coder Artifact must follow this structure:
- Main Function:
  - Must accept an `arguments` parameter that contains all input data
  - Process the input according to business logic
  - Return structured output data
- Required Boilerplate Code: stores and returns the result so the platform can read it (see the sketch after this list)
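The exact boilerplate is supplied by the Gaife code editor and is not reproduced here. The skeleton below is only a minimal sketch of the overall shape; the function name `main` and the key `order_amount` are illustrative assumptions.

```python
def main(arguments):
    # 1. Extract and validate inputs from the arguments dictionary
    order_amount = float(arguments.get('order_amount', 0.0))

    # 2. Apply your business logic
    tax_amount = order_amount * 0.10

    # 3. Return structured output data
    return {
        'order_amount': f'{order_amount:.2f}',
        'tax_amount': f'{tax_amount:.2f}',
    }

# The required boilerplate code goes here; per the Boilerplate Code section
# below, its job is to properly store and return the result.
```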
Example Implementation
Here's an example of a complete Coder Artifact for invoice calculation:
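The sketch below is a representative implementation rather than the exact original example; the function name `main`, the argument keys (`items_to_be_billed`, `tax_rate`, `discount_rate`), and the output keys are illustrative assumptions that follow the naming conventions described later in this guide.

```python
def main(arguments):
    try:
        # Extract inputs with defaults and explicit type conversion
        items = arguments.get('items_to_be_billed', [])
        tax_rate = float(arguments.get('tax_rate', 0.0))
        discount_rate = float(arguments.get('discount_rate', 0.0))

        # Business logic: line totals, then discount, then tax
        subtotal = sum(
            float(item.get('quantity', 0)) * float(item.get('unit_price', 0.0))
            for item in items
        )
        discount_amount = subtotal * discount_rate
        taxable_amount = subtotal - discount_amount
        tax_amount = taxable_amount * tax_rate
        total_due = taxable_amount + tax_amount

        # Structured output with consistently formatted values
        return {
            'invoice_summary': {
                'subtotal': f'{subtotal:.2f}',
                'discount_amount': f'{discount_amount:.2f}',
                'tax_amount': f'{tax_amount:.2f}',
                'total_due': f'{total_due:.2f}',
            },
            'item_count': int(len(items)),
        }
    except (TypeError, ValueError) as error:
        # Return a structured error so the workflow can handle the failure
        return {'error': str(error)}
```

Key Components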
Function Parameters
Every Coder Artifact function must accept a single parameter called `arguments`, which is a dictionary containing all input data passed to the function.
Input Handling
Proper input handling is essential:
- Extract values using `arguments.get('key', default_value)`
- Include type conversion for input values (e.g., `float()`, `int()`)
- Handle potential missing or malformed data
- Implement error handling with try/except blocks
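A minimal sketch of this pattern, assuming hypothetical keys `customer_id`, `quantity`, and `unit_price`:

```python
def main(arguments):
    try:
        # Extract values with defaults and explicit type conversion
        customer_id = str(arguments.get('customer_id', ''))
        quantity = int(arguments.get('quantity', 0))
        unit_price = float(arguments.get('unit_price', 0.0))
    except (TypeError, ValueError) as error:
        # Surface malformed input instead of failing silently
        return {'error': f'Invalid input: {error}'}

    return {
        'customer_id': customer_id,
        'line_total': f'{quantity * unit_price:.2f}',
    }
```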
Processing Logic
Implement your business logic in the middle section of your function:
- Perform calculations
- Apply business rules
- Format data as needed
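For example, a hypothetical volume-discount rule might sit in this middle section:

```python
def apply_volume_discount(quantity, unit_price):
    # Hypothetical business rule: orders of 1000.00 or more get a 5% discount
    discount_threshold = 1000.0
    discount_rate = 0.05

    subtotal = quantity * unit_price
    discount = subtotal * discount_rate if subtotal >= discount_threshold else 0.0
    return round(subtotal - discount, 2)
```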
Output Structure
Return a structured dictionary that matches your expected output format:
- Nest data in appropriate hierarchies
- Format values consistently
- Include all required fields
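A short sketch of a nested output dictionary; the section and key names are illustrative:

```python
output = {
    'customer': {
        'customer_id': 'CUST-001',
        'payment_method': 'credit_card',
    },
    'invoice_summary': {
        'subtotal': '1250.00',
        'tax_amount': '112.50',
        'total_due': '1362.50',
    },
}
```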
Boilerplate Code
Always include the required boilerplate code at the end to properly store and return results.
Naming Conventions
Following consistent naming conventions ensures readability, maintainability, and consistency across all Gaife automation workflows.
Arguments Dictionary Naming Conventions
General Rules
- Parameter Name: Always use `arguments` (lowercase) as the parameter name for your main function
- Key Formatting:
  - Use snake_case (lowercase with underscores)
  - Use descriptive names that clearly indicate purpose
  - Avoid abbreviations unless widely recognized
  - Keep names concise yet meaningful
Examples of Well-Formatted Argument Keys
Recommended:
- `customer_id`
- `invoice_date`
- `items_to_be_billed`
- `tax_rate`
- `shipping_address`
- `payment_method`
Avoid:
- `CustomerID` (not snake_case)
- `inv_dt` (unclear abbreviation)
- `i` (not descriptive)
- `calculation amount` (spaces instead of underscores)
Nested Arguments
For nested dictionaries within arguments:
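The original snippet is not reproduced here; the sketch below uses illustrative keys to show the same snake_case rules applied at every level of nesting:

```python
arguments = {
    'customer_id': 'CUST-001',
    'shipping_address': {
        'street_name': 'Baker Street',
        'postal_code': 'NW1 6XE',
        'country_code': 'GB',
    },
    'items_to_be_billed': [
        {'item_name': 'Widget', 'quantity': 3, 'unit_price': 25.00},
    ],
}
```

Output Dictionary Naming Conventions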
General Rules
- Result Structure: Always use the standard result structure (an illustrative example appears under Examples of Well-Formatted Output Keys below)
- Primary Output Keys:
  - Use snake_case for all keys (lowercase with underscores)
  - Format values consistently based on their type
  - For categories or sections, use descriptive names
- Formatting Values:
  - Format numeric values consistently (e.g., `f'{value:.2f}'` for currency)
  - Use ISO formats for dates when possible
  - Convert all values to appropriate string formats for API compatibility
Examples of Well-Formatted Output Keys
Recommended Output Structure:
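The original example is not reproduced here; the sketch below shows a result dictionary that follows these rules, with hypothetical keys and values:

```python
result = {
    'invoice_summary': {
        'subtotal': '1250.00',
        'discount_amount': '62.50',
        'tax_amount': '106.88',
        'total_due': '1294.38',
    },
    'invoice_date': '2023-04-15',
    'item_count': 3,
    'is_taxable': True,
}
```

Data Type Conventions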
Numeric Values
- Format currency values with two decimal places: `f'{value:.2f}'`
- Use integers for counts: `item_count = int(value)`
- Format percentages consistently: `f'{value:.1f}%'` or as decimal values
Date and Time Values
- Use ISO format (YYYY-MM-DD) for date storage: `2023-04-15`
- For display formatting, use consistent patterns: `'%d-%m-%Y'`
- Include timezone information for time-sensitive operations
Boolean Values
- Use clear boolean names: `is_taxable`, `has_discount`, `requires_shipping`
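A combined sketch of these conventions; the values are illustrative:

```python
from datetime import datetime, timezone

# Numeric values
formatted_amount = f'{1294.375:.2f}'   # currency with two decimals -> '1294.38'
item_count = int('3')                  # integer count
tax_rate_display = f'{7.5:.1f}%'       # percentage -> '7.5%'

# Date and time values
invoice_date = '2023-04-15'            # ISO format for storage
display_date = datetime.strptime(invoice_date, '%Y-%m-%d').strftime('%d-%m-%Y')  # '15-04-2023'
created_at = datetime.now(timezone.utc).isoformat()  # timezone-aware timestamp

# Boolean values with clear names
is_taxable = True
has_discount = False
```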
Testing and Publishing
Testing Your Code
- Use the "Dry Run" button to test your code with sample inputs
- Check the output to ensure it matches the expected format
- Debug any issues before publishing
Publishing
- Click the "Publish" button to make your artifact available for use
- Published artifacts can be used by other components in your knowledge agent
Best Practices
- Error Handling: Always implement robust error handling with try/except blocks
- Input Validation: Validate all input data before processing
- Documentation: Add detailed comments to explain the logic and data flow
- Type Conversion: Convert input data to appropriate types (string, int, float)
- Default Values: Provide sensible default values for optional parameters
- Code Organization: Structure your code logically with clear sections
- Naming Consistency: Use the same name for the same concept throughout your code
- Clarity: Choose names that reveal intent
- Simplicity: Keep names simple but descriptive
- Hierarchy: Structure output dictionaries logically with clear parent-child relationships
Integration with Workflow Agent
The Coder Artifact is a key component in the Workflow Agent ecosystem:
- Workflow Integration: Your published Coder Artifact will be available as a processing step within workflow agents
- Data Flow:
  - Workflow agents pass input parameters to your Coder function
  - The function processes these parameters according to your logic
  - Output parameters are returned to the workflow for further processing or final delivery
- Complex Processing: Workflow agents can chain multiple Coder Artifacts together to handle complex calculations and data transformations
- Automation Pipeline: Acts as the computational engine for automation workflows, handling everything from simple calculations to complex business logic
Use Cases in Workflows
- Financial Calculations: Process invoice amounts, taxes, and discounts as shown in the example
- Data Transformation: Convert data between formats or structures
- Decision Logic: Implement complex business rules that determine workflow paths
- API Integration: Pre-process data before sending to external systems
- Analytics: Perform statistical analysis or data aggregation within workflows