
Meta Llama AI

App Vendor: Meta Llama AI

App Category: AI Models

Connector Version: 2.0.0

API Version: 1.0.0

About App

This connector integrates with the Llama AI API, enabling seamless interaction with advanced language models.

The Meta Llama AI app is configured with Cyware Orchestrate to perform the following actions:

| Action Name | Description |
| --- | --- |
| Generic Action | This is a generic action used to make requests to any Meta Llama AI endpoint. |
| Get Prompt Response | This action retrieves the response generated by the specified model based on the given prompt. |
| Get Raw Response | This action retrieves the unformatted (raw) response generated by the selected model for a given prompt. |

Configuration Parameters

The following configuration parameters are required for the Meta Llama AI app to communicate with the Meta Llama AI enterprise application. The parameters can be configured by creating instances in the app.

| Parameter | Description | Field Type | Required/Optional | Comments |
| --- | --- | --- | --- | --- |
| API Token | Enter the API token used to authenticate to Llama AI. | Password | Required | |
| Temperature | Enter the sampling temperature (a number between 0 and 2). Higher values (e.g., 0.8) make the output more random, while lower values (e.g., 0.2) make it more focused and deterministic. | Float | Optional | Default value: 1 |
| Max Tokens | Specify the maximum number of tokens to generate before stopping. Each model has a different maximum value for this parameter. | Integer | Optional | Default value: 500 |
| Base URL | Enter the base URL to access the Meta Llama API. Example: https://api.llama-api.com | Text | Optional | |
| Timeout | Enter the timeout value (in seconds) for the API request. | Integer | Optional | Allowed range: 15-120. Default value: 15 |
| Verify | Choose whether to verify the SSL certificate or skip verification. | Boolean | Optional | Allowed values: true and false. By default, verification is enabled. |
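
To illustrate how these parameters fit together, here is a minimal Python sketch of an equivalent raw API call. The /chat/completions path, the Bearer authorization header, and the request body shape are assumptions based on a typical OpenAI-compatible chat API; they are not details documented by this connector.

```python
import requests

# A minimal sketch of how the instance configuration could translate into a raw
# API call. The /chat/completions path and request body shape are assumptions
# (typical OpenAI-compatible layout), not taken from this page.
config = {
    "api_token": "<your-api-token>",          # Password field, required
    "temperature": 1,                         # optional, default 1
    "max_tokens": 500,                        # optional, default 500
    "base_url": "https://api.llama-api.com",  # optional, example from the table
    "timeout": 15,                            # optional, default 15 (allowed 15-120)
    "verify": True,                           # optional, SSL verification on by default
}

response = requests.post(
    f"{config['base_url']}/chat/completions",  # assumed endpoint path
    headers={"Authorization": f"Bearer {config['api_token']}"},
    json={
        "model": "llama3.1-70b",
        "messages": [{"role": "user", "content": "Hello"}],
        "temperature": config["temperature"],
        "max_tokens": config["max_tokens"],
    },
    timeout=config["timeout"],
    verify=config["verify"],
)
response.raise_for_status()
print(response.json())
```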

Action: Generic Action

This is a generic action used to make requests to any Meta Llama AI endpoint.

Action Input Parameters

| Parameter | Description | Field Type | Required/Optional | Comments |
| --- | --- | --- | --- | --- |
| Method | Enter the HTTP method to use for the request. | Text | Required | Allowed values: GET, PUT, POST, and DELETE |
| Endpoint | Enter the endpoint to make the request to. Example: responses | Text | Required | |
| Query Params | Enter the query parameters to pass to the API. | Key Value | Optional | |
| Payload | Enter the payload to pass to the API. | Any | Optional | |
| Extra Fields | Enter the extra fields to pass to the API. | Key Value | Optional | Allowed keys: payload_data, payload_json, download, files, filename, retry_wait, retry_count, custom_output, and response_type |
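
As a rough mental model, the Generic Action boils down to a single HTTP request assembled from these inputs. The sketch below assumes the base URL, API token, timeout, and verify values come from the instance configuration; handling of the extra fields (retries, file downloads, custom output, and so on) is internal to the connector and is not reproduced here.

```python
import requests

# Hypothetical sketch of the request the Generic Action assembles from its inputs.
# Authentication via a Bearer header is an assumption; extra fields such as
# retry_wait and retry_count are handled by the connector and omitted here.
def generic_action(method, endpoint, query_params=None, payload=None,
                   base_url="https://api.llama-api.com", api_token="<token>",
                   timeout=15, verify=True):
    response = requests.request(
        method,                                            # GET, PUT, POST, or DELETE
        f"{base_url.rstrip('/')}/{endpoint.lstrip('/')}",  # e.g. endpoint "responses"
        headers={"Authorization": f"Bearer {api_token}"},
        params=query_params,                               # query params, if any
        json=payload,                                      # payload, if any
        timeout=timeout,
        verify=verify,
    )
    response.raise_for_status()
    return response.json()
```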

Action: Get Prompt Response

This action retrieves the response generated by the specified model based on the given prompt.

Action Input Parameters

| Parameter | Description | Field Type | Required/Optional | Comments |
| --- | --- | --- | --- | --- |
| Model | Specify the model name to use for generating the response. Example: llama3.1-70b | Single-select | Required | |
| Prompt | Provide the text prompt that will be sent to the selected model for response generation. | Text Area | Required | |
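
Conceptually, this action sends the prompt to the chosen model and returns the generated text. The sketch below assumes an OpenAI-compatible /chat/completions endpoint and response layout; the connector's internal call is not documented here and may differ.

```python
import requests

# Rough sketch of Get Prompt Response, assuming an OpenAI-compatible
# /chat/completions endpoint and response layout (an assumption).
def get_prompt_response(model, prompt, api_token="<token>",
                        base_url="https://api.llama-api.com"):
    response = requests.post(
        f"{base_url}/chat/completions",                     # assumed endpoint path
        headers={"Authorization": f"Bearer {api_token}"},
        json={
            "model": model,                                 # e.g. "llama3.1-70b"
            "messages": [{"role": "user", "content": prompt}],
        },
        timeout=15,
    )
    response.raise_for_status()
    # The formatted result of this action would be just the generated text.
    return response.json()["choices"][0]["message"]["content"]

print(get_prompt_response("llama3.1-70b", "Summarize the shared responsibility model."))
```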

Action: Get Raw Response

This action retrieves the unformatted (raw) response generated by the selected model for a given prompt.

Action Input Parameters

| Parameter | Description | Field Type | Required/Optional | Comments |
| --- | --- | --- | --- | --- |
| Model | Specify the model name to use for generating the response. Example: llama3.1-70b | Single-select | Required | |
| Prompt | Enter the text prompt to send to the selected model. Example: List the top 5 cybersecurity risks for cloud-based environments. | Text Area | Required | |
| Response Format | Enter the format in which you want the response to be returned. Example: $json[{'type': 'json_schema', 'json_schema': {'name': 'address', 'schema': {}}}] | Any | Optional | Allowed format: json_schema |
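
The difference from Get Prompt Response is that the full, unparsed API response is returned, and an optional Response Format can request structured output. The sketch below assumes the json_schema value is passed through as a response_format field on an OpenAI-compatible request, mirroring the example in the table; the actual pass-through behavior is handled by the connector.

```python
import requests

# Rough sketch of Get Raw Response. Forwarding the Response Format input as a
# "response_format" field is an assumption based on the example above; unlike
# Get Prompt Response, the raw JSON response is returned as-is.
def get_raw_response(model, prompt, response_format=None, api_token="<token>",
                     base_url="https://api.llama-api.com"):
    payload = {
        "model": model,                                     # e.g. "llama3.1-70b"
        "messages": [{"role": "user", "content": prompt}],
    }
    if response_format:
        payload["response_format"] = response_format        # e.g. a json_schema spec
    response = requests.post(
        f"{base_url}/chat/completions",                     # assumed endpoint path
        headers={"Authorization": f"Bearer {api_token}"},
        json=payload,
        timeout=15,
    )
    response.raise_for_status()
    return response.json()                                  # raw, unformatted response

raw = get_raw_response(
    "llama3.1-70b",
    "List the top 5 cybersecurity risks for cloud-based environments.",
    response_format={"type": "json_schema",
                     "json_schema": {"name": "address", "schema": {}}},
)
```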