IQ

Introduction

IQ is NEONNOW's Gen AI assistant service, providing Live Suggestions, Spelling, Prediction, Rewrites, Evaluation, Summarization and more.

The service spans not only the agent desktop, but also services supporting contact flow implementation.

Further features will be added to IQ with each NEONNOW release.

Service Overview

NEONNOW IQ utilizes AWS's Bedrock service to access AWS-hosted Large Language Models (LLMs) to support various Generative AI use cases.

Bedrock runs in your AWS account and you have complete control of your data. All services (for example Bedrock Agents) are deployed via NEONNOW's orchestration layer - you do not need to deploy these manually.

Requirements and LLM Model lifecycle

NEONNOW IQ orchestrates and utilizes AWS Bedrock Agents. Model lifecycle is a key consideration when using LLMs, and this is handled by NEONNOW.

To ensure continuous support, IQ maintains the concept of a 'current' model and a 'new' model. As new models become available, they are phased in and made available for enablement.
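The 'current'/'new' phasing can be pictured with a small sketch. This is illustrative only; the identifiers and structure below are placeholders, not NEONNOW's actual configuration or API:

```python
# Hypothetical sketch of the 'current'/'new' model phasing described above.
# Model identifiers are placeholders, not NEONNOW's real configuration.

lifecycle = {
    "current": {"model_id": "<current-model-id>", "enabled": True},
    "new": {"model_id": "<new-model-id>", "enabled": False},  # phased in over time
}

def active_model(models: dict) -> str:
    """Prefer the new model once it has been enabled; otherwise fall back to current."""
    if models["new"]["enabled"]:
        return models["new"]["model_id"]
    return models["current"]["model_id"]

print(active_model(lifecycle))  # the current model, until the new model is enabled
```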

Generative AI Usage & Pricing

AWS Bedrock pricing is documented on the AWS Bedrock pricing page.

Each NEONNOW IQ operation will consume 'tokens' as a standard measure of LLM usage.

As a rough order of magnitude, evaluating a short paragraph of text using IQ Evaluate might consume:

  • input tokens = 400
  • output tokens = 500

Assuming example rates of $0.008 per 1,000 input tokens and $0.024 per 1,000 output tokens:

Total cost incurred = (400/1000 × $0.008) + (500/1000 × $0.024) = $0.0032 + $0.012 = $0.0152
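The arithmetic above can be reproduced with a small helper. The per-1,000-token rates are the example figures used here; actual Bedrock rates vary by model and region:

```python
def iq_operation_cost(input_tokens: int, output_tokens: int,
                      input_rate_per_1k: float = 0.008,
                      output_rate_per_1k: float = 0.024) -> float:
    """Estimate the cost of one IQ operation from its token counts.

    The default rates are the example per-1,000-token prices above;
    check AWS Bedrock pricing for the model you have enabled.
    """
    return (input_tokens / 1000) * input_rate_per_1k \
         + (output_tokens / 1000) * output_rate_per_1k

print(round(iq_operation_cost(400, 500), 4))  # 0.0152
```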

Setup & Readiness

To enable IQ, log into NEONNOW Admin, and review the Ready Checklist. This will guide you through the required steps.

The three checks are as follows:

  1. The existing NEONNOW service role must be extended to allow orchestration of Bedrock Agents, as well as to allow the role to inspect itself for role version checks. It is recommended to copy the role policy shown in NEONNOW Admin into an inline policy on the role. An example is shown below for informational purposes only (ensure the policy is copied from NEONNOW Admin).
  {
    "Sid": "CheckOwnRole",
    "Effect": "Allow",
    "Action": [
      "iam:GetRolePolicy",
      "iam:ListRolePolicies"
    ],
    "Resource": [
      "arn:aws:iam::<AWS Account>:role/<unique-id>-client-apiExeRole"
    ]
  },
  {
    "Sid": "CheckOtherCWRoles",
    "Effect": "Allow",
    "Action": [
      "iam:ListRoles"
    ],
    "Resource": [
      "arn:aws:iam::<AWS Account>:role/"
    ]
  },
  {
    "Sid": "PassIQServiceRole",
    "Effect": "Allow",
    "Action": [
      "iam:PassRole"
    ],
    "Resource": [
      "arn:aws:iam::<AWS Account>:role/neonnow-iq-service-role"
    ]
  }
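After pasting the policy copied from NEONNOW Admin, a quick local sanity check can catch a missing statement or an unreplaced placeholder. This is an illustrative sketch only; `check_inline_policy` and the checks it performs are hypothetical helpers, not part of NEONNOW:

```python
import json

# Statement Sids the extended service role is expected to contain (per the example above).
REQUIRED_SIDS = {"CheckOwnRole", "CheckOtherCWRoles", "PassIQServiceRole"}

def check_inline_policy(policy_text: str) -> list:
    """Return a list of problems found in a pasted inline policy document."""
    problems = []
    policy = json.loads(policy_text)
    sids = {s.get("Sid") for s in policy.get("Statement", [])}
    missing = REQUIRED_SIDS - sids
    if missing:
        problems.append(f"missing statements: {sorted(missing)}")
    # Placeholders such as <AWS Account> must be replaced with real values.
    if "<AWS Account>" in policy_text or "<unique-id>" in policy_text:
        problems.append("placeholder values not replaced")
    return problems
```

An empty result means the pasted policy at least contains the expected statements and no obvious placeholders.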
  2. The current model (LLM) must be enabled - the required LLM is shown in the mouse-over of the 'Model Enabled' item. If this step shows a warning, select the 'Enable' button to be taken to the Model page, where a use case may need to be submitted.

Submit the following text in the use case:

  • Company Name: enter your company name
  • Company website URL: enter your company website
  • What industry do you operate in?: select your company's industry from drop-down
  • Who are the intended users you are building for?: Select 'Internal users'
  • Describe your use cases: we will be using the model to summarize, rewrite, and predict text for our contact centre application NEONNOW, which integrates with Amazon Connect and uses various AWS services including Bedrock Agents.

Use cases are normally approved within 10-15 minutes.

  3. A new IQ service role (neonnow-iq-service-role) must be created to allow orchestration of Bedrock Agents. This is the role referenced in the main service role extension described above. If the role has not been created, select 'Fix'. This opens the launch screen to run a CloudFormation stack that creates the new service role. Select 'Create AWS Role' on the modal window, tick the 'I acknowledge that AWS CloudFormation might create IAM resources with custom names.' box, and select 'Create stack'. Once the role is created, review the NEONNOW Admin readiness checks - all steps should now show as ready. The service role policy is shown below for informational purposes only.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Action": [
        "bedrock:InvokeModel",
        "bedrock:InvokeModelWithResponseStream"
      ],
      "Resource": "arn:aws:bedrock:*::foundation-model/*",
      "Effect": "Allow",
      "Sid": "AmazonBedrockAgentBedrockFoundationModelPolicy"
    },
    {
      "Effect": "Allow",
      "Action": [
        "bedrock:InvokeModel*",
        "bedrock:CreateInferenceProfile"
      ],
      "Resource": [
        "arn:aws:bedrock:*::foundation-model/*",
        "arn:aws:bedrock:*:*:inference-profile/*",
        "arn:aws:bedrock:*:*:application-inference-profile/*"
      ]
    },
    {
      "Effect": "Allow",
      "Action": [
        "bedrock:GetInferenceProfile",
        "bedrock:ListInferenceProfiles",
        "bedrock:DeleteInferenceProfile",
        "bedrock:TagResource",
        "bedrock:UntagResource",
        "bedrock:ListTagsForResource"
      ],
      "Resource": [
        "arn:aws:bedrock:*:*:inference-profile/*",
        "arn:aws:bedrock:*:*:application-inference-profile/*"
      ]
    }
  ]
}

In addition, select the Output Locale, which determines the text locale returned by the various IQ services.