With integration to OpenAI on Azure, Tikit Virtual Agent can provide generative AI responses based on documentation you upload. Generative responses use your uploaded documentation as a means to “learn” how to best provide a response. With generative responses, answers are unique and vary based on slight differences in how you prompt Tikit Virtual Agent.
Generative responses will be used when no configured Knowledge is found in your Knowledge Base. This ensures that responses you’ve configured (KB or Templates) are returned before a generative response is attempted.
OpenAI on Azure is deployed in your tenant, so your data stays with you. This article will walk you through registration, deployment, and configuration. This integration makes use of three different Azure resources - OpenAI, Azure Search Service, and Azure Blob Storage.
One Click Deploy
Using the following button, you can deploy Tikit's OpenAI integration quickly. This deployment utilizes the lowest cost options for OpenAI, Search Service, and Storage.
These three resources total approximately $76 per month. Pricing details are subject to change by Microsoft and should be validated after deployment. OpenAI tokens are not included in this estimate; more on estimating token costs can be found at the bottom of this article.
Resources can always be scaled up later if necessary.
Choose your Subscription and a Resource Group where these resources will be created. Your Resource Group will automatically determine the Region. Then choose the Resource Location; this is where OpenAI, Azure Search, and the Storage Account will be deployed.
Once you've provided names for these resources, click Review + Create and then Create. The deployment will begin and should finish in a couple of minutes.
Once it completes, head over to the Outputs tab.
Here you'll be able to copy and paste the majority of values required into Tikit's Azure AI Settings page. The only setting that cannot be copied is the Index Name.
At this point, we're almost done with integration; next, we'll train OpenAI on the documentation you upload.
Creating the Index and Uploading Documentation
At this point, we have a couple of options to choose from to train OpenAI on our documentation. The two most noteworthy are:
One time File Upload
Fastest to set up, but this step must be repeated every time you want to add documentation. Existing documentation must also be re-uploaded to maintain existing knowledge.
Upload into Blob Storage with support for Retraining
Slower to set up, but better long-term functionality: you can schedule retraining (e.g., hourly or daily) and push uploads via the Azure Portal and/or Power Automate when documentation is created or modified. Existing documentation does not have to be re-uploaded.
We're going to choose One Time File Upload to get some initial items set up, and then we can easily transition to Upload into Blob Storage.
Click on the Overview tab for your Deployment and click "Go to Resource Group"
NOTE: If you lose your place and want to see the Outputs tab, navigate into the Resource Group and click the link next to Deployments on the Overview tab.
One Time File Upload
Next, click into your Azure OpenAI resource by clicking its Name.
Then click on Explore Azure AI Foundry portal
Next, it’s time to upload the documents you want to use to train OpenAI. In Chat, select “Add your data.”
3. In the “Add data” menu:
– Select data source: Upload files (preview)
– Subscription: Choose your Azure subscription
– Select Azure Blob storage resource: Select the Resource you created in the Deployment
– If you are prompted to enable CORS, click the button to enable it
– Select Azure AI Search resource: Select the Resource you created in the Deployment
– Index Name: tikitopenaisearch
– If you choose to call this something else, take note of it, as it is the last value to copy into the Tikit Azure AI Settings page.
4. Upload your documentation. This can be text, HTML, Markdown, PDF, Word, or PowerPoint files. Each file can be no greater than 16 MB.
5. Data Management
– Search Type: Keyword
– While you can choose Semantic, please be aware there are additional Azure costs associated with this. Pricing details can be found here - https://azure.microsoft.com/en-us/pricing/details/search/
– Chunk Size: 1024
6. Next, select “API Key” and click Next
7. Finally confirm your configuration and select “Save and close”.
The Azure AI Foundry portal will provide a status message letting you know document ingestion has begun. Wait until this finishes before testing out questions in the Chat Playground or setting up recurring training next.
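Before uploading, it can help to sanity-check your files against the supported formats and the 16 MB size limit from step 4. The following is a hypothetical helper (not part of Tikit or Azure); the extension list is an assumption based on the formats named above:

```python
from pathlib import Path

# Formats accepted by "Upload files (preview)" per the steps above (assumed extensions)
SUPPORTED_EXTENSIONS = {".txt", ".html", ".md", ".pdf", ".docx", ".pptx"}
MAX_SIZE_BYTES = 16 * 1024 * 1024  # each file must be 16 MB or smaller

def validate_upload(path: Path) -> list[str]:
    """Return a list of problems; an empty list means the file looks uploadable."""
    problems = []
    if path.suffix.lower() not in SUPPORTED_EXTENSIONS:
        problems.append(f"{path.name}: unsupported format {path.suffix!r}")
    if path.exists() and path.stat().st_size > MAX_SIZE_BYTES:
        problems.append(f"{path.name}: exceeds the 16 MB limit")
    return problems
```

Running this over a folder of documentation before the upload step saves a failed ingestion round-trip.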
Upload into Blob Storage with Support for Retraining
Once document ingestion is complete, you can perform the following steps to move from a one-time file upload to recurring training.
If you're looking to:
Upload more documentation
Remove documentation
Run training on a schedule
Retrain OpenAI on demand
Automate file uploads from OneDrive or SharePoint
Check out this related KB Article!
Integrate OpenAI
You're almost done!
To configure integration with OpenAI on Azure, you’ll need to copy one last piece of information from above into the Settings page and choose whether you'd like file citations to be included in results.
1. Copy the Index Name that you used above into the Tikit Azure AI page.
2. If you wish generative answers to provide downloadable links to the articles a response was based on, select “Allow citation file downloads”.
OpenAI on Azure is now processing your data and preparing to respond with generative answers. You can test this out within the Chat Playground. Ask a question about some of the documentation you uploaded. The following image is an example wherein documentation about Tikit’s Quick Start guide was uploaded.
Estimating Token Cost
We can also use this example to approximate the cost of the above configuration, where 1 token is approximately 4 characters. Please note that costs will vary and the following is only an approximation; the most up-to-date pricing information is available on Microsoft's pricing page.
– $0.0000004 per input token ($0.0000004 x 57 = $0.0000228)
– $0.0000016 per output token ($0.0000016 x 348 = $0.0005568)
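The arithmetic above can be sketched as a small helper. The per-token rates are the example rates quoted here and should be replaced with current Azure pricing:

```python
# Example rates from above; verify against Microsoft's current pricing page.
INPUT_RATE = 0.0000004   # $ per input (prompt) token
OUTPUT_RATE = 0.0000016  # $ per output (completion) token

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Approximate cost in dollars for a single interaction."""
    return input_tokens * INPUT_RATE + output_tokens * OUTPUT_RATE

# The example interaction above: 57 input tokens, 348 output tokens
cost = estimate_cost(57, 348)  # $0.0000228 + $0.0005568 = $0.0005796
```

Multiplying by your expected number of interactions per month gives a rough generative-answer budget on top of the fixed resource costs.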
Change the System Prompt
Once the integration with OpenAI on Azure has been set up, you can optionally alter the System Prompt. Think of the System Prompt as the instructions that are always provided to your OpenAI instance. By default, the System Prompt is:
You are an AI assistant that helps people find information.
This message can be altered or added to. For example, let's say you wanted to slightly modify the tone and/or personality of generative AI responses:
Answers you provide should be truthful but said in a light hearted tone
Alternatively, you can provide very direct instructions:
If you are asked who you are, respond by saying "I'm Tikit Virtual Agent of course!" If you are asked how you know what you know, respond that your knowledge is based entirely upon what has been uploaded.
The System Prompt will count towards the total number of tokens used per interaction. Keep this in mind as you construct your own System Prompts.
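Using the same rough rule of thumb as the cost section (1 token ≈ 4 characters), you can ballpark how many tokens a candidate System Prompt adds to every interaction. This is only an approximation; actual tokenization depends on the model:

```python
import math

def approx_tokens(text: str) -> int:
    """Rough token estimate using the ~4 characters-per-token rule of thumb."""
    return math.ceil(len(text) / 4)

default_prompt = "You are an AI assistant that helps people find information."
# Every interaction pays roughly this many input tokens for the prompt alone
overhead = approx_tokens(default_prompt)  # ~15 tokens
```

A longer, more prescriptive System Prompt raises this per-interaction overhead, so weigh added instructions against the token cost they carry.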