From 8113d4e85cca60ed507bed5d70a9fab9be983ee8 Mon Sep 17 00:00:00 2001 From: IvetNikolova Date: Thu, 8 May 2025 18:01:29 +0300 Subject: [PATCH 1/8] Create AI-powered-insights.md --- interactivity/AI-powered-insights.md | 60 ++++++++++++++++++++++++++++ 1 file changed, 60 insertions(+) create mode 100644 interactivity/AI-powered-insights.md diff --git a/interactivity/AI-powered-insights.md b/interactivity/AI-powered-insights.md new file mode 100644 index 000000000..9dbc189ba --- /dev/null +++ b/interactivity/AI-powered-insights.md @@ -0,0 +1,60 @@ +--- +title: AI-powered insights in Report Preview +page_title: AI-powered insights in Report Preview +description: "Learn how to implement a prompt UI as part of the Web report viewer" +slug: telerikreporting/designing-reports/adding-interactivity-to-reports/ai-powered-insights +tags: telerik, reporting, ai, +published: True +position: 1 +--- + +# AI-powered insights Overview + +The AI-powered insights in Report Preview provide comprehensive capabilities, including response generation, prompt creation, AI output interaction, and execution of predefined commands. + +## OpenAI Implementations + +* Ask AI: This functionality enables users to pose questions to the AI, facilitating interactive and dynamic responses based on the provided document context. + +* Output: This feature generates outputs from the AI, including summaries, highlights, and other predefined commands, enhancing the overall productivity and efficiency of the report viewer. + +## Configure the AI + +| | | +| ------ | ------ | +|friendlyName|| +|model|| +|endpoint|| +|credential|| +|allowOnlyPredefinedPrompts|This setting is set to false by default. If you set it to true you will not be allowed to ask anything except the predefined prompts. For example, if you write "Hi" it will throw an exception| +|predefinedPrompts|| + +````JSON +"AIClient": { + "friendlyName": "MicrosoftExtensionsAzureOpenAI", + "model": "gpt-4o-mini", + "endpoint": "https://ai-explorations.openai.azure.com/", + "credential": "", + "allowOnlyPredefinedPrompts": true, + "predefinedPrompts": [ + { "text": "Prompt 1" }, + { "text": "Prompt 2" } + ] +} +```` + +````XML + + + + + + + + +```` \ No newline at end of file From 3f3512f7febdbfe18f042c408f21835c2c482661 Mon Sep 17 00:00:00 2001 From: IvetNikolova <118352332+IvetNikolova@users.noreply.github.com> Date: Fri, 9 May 2025 10:39:33 +0300 Subject: [PATCH 2/8] Updated AI-powered-insights.md --- interactivity/AI-powered-insights.md | 29 +++++++++++++++++++--------- 1 file changed, 20 insertions(+), 9 deletions(-) diff --git a/interactivity/AI-powered-insights.md b/interactivity/AI-powered-insights.md index 9dbc189ba..15a9f9517 100644 --- a/interactivity/AI-powered-insights.md +++ b/interactivity/AI-powered-insights.md @@ -20,14 +20,25 @@ The AI-powered insights in Report Preview provide comprehensive capabilities, in ## Configure the AI -| | | +| Setting | Description | | ------ | ------ | -|friendlyName|| -|model|| -|endpoint|| -|credential|| -|allowOnlyPredefinedPrompts|This setting is set to false by default. If you set it to true you will not be allowed to ask anything except the predefined prompts. For example, if you write "Hi" it will throw an exception| -|predefinedPrompts|| +|friendlyName|This setting specifies the name corresponding to the type of AI client you wish to use. 
For example, setting friendlyName to "MicrosoftExtensionsAzureOpenAI" indicates that the Azure OpenAI client is being utilized| +|model|This setting specifies the AI model to be used for generating responses. For example, setting the model to "gpt-4o-mini" indicates that the GPT-4 model variant is being utilized| +|endpoint|This setting specifies the URL of the AI service endpoint| +|credential|This setting specifies the authentication credentials required to access the AI service. It ensures that the AI client can securely connect to the specified endpoint| +|allowOnlyPredefinedPrompts|This setting is set to false by default. If you set it to `True`, you will not be allowed to ask anything except the predefined prompts. For example, if you write "Hi" it will throw an exception| +|predefinedPrompts|This setting specifies a list of predefined prompts that the AI client can use. Each prompt is defined by a text attribute, which contains the prompt's content| + +__AI clients__ + +We have four available options for the `friendlyName` setting + +| | | +| ------ | ------ | +|Microsoft.Extensions.AzureAIInference.Core|DefaultProviderInfoFactory.MicrosoftExtensionsAzureAIInference| +|Microsoft.Extensions.AzureOpenAI.Core|DefaultProviderInfoFactory.MicrosoftExtensionsAzureOpenAI| +|Microsoft.Extensions.AzureOllama.Core|DefaultProviderInfoFactory.MicrosoftExtensionsOllama| +|Microsoft.Extensions.OpenAI.Core|DefaultProviderInfoFactory.MicrosoftExtensionsOpenAI| ````JSON "AIClient": { @@ -46,7 +57,7 @@ The AI-powered insights in Report Preview provide comprehensive capabilities, in ````XML -```` \ No newline at end of file +```` From 04d3db62e61af44c9a3916744d51bc663e3eefbb Mon Sep 17 00:00:00 2001 From: IvetNikolova <118352332+IvetNikolova@users.noreply.github.com> Date: Fri, 9 May 2025 10:56:26 +0300 Subject: [PATCH 3/8] Updated AI-powered-insights.md --- interactivity/AI-powered-insights.md | 7 +++++++ 1 file changed, 7 insertions(+) diff --git a/interactivity/AI-powered-insights.md b/interactivity/AI-powered-insights.md index 15a9f9517..27b6f3348 100644 --- a/interactivity/AI-powered-insights.md +++ b/interactivity/AI-powered-insights.md @@ -69,3 +69,10 @@ We have four available options for the `friendlyName` setting ```` +## Extensibility + +* If the existing clients are insufficient, custom implementations can be developed and added to the REST service configuration + +* Predefined prompts in the REST service controllers can be intercepted and modified + +* Certain metadata sent to AI models can be altered through resx files From a6b5c7e4e144c3584b0b63244da95808f2c2c0a0 Mon Sep 17 00:00:00 2001 From: IvetNikolova <118352332+IvetNikolova@users.noreply.github.com> Date: Fri, 9 May 2025 11:37:44 +0300 Subject: [PATCH 4/8] Updated AI-powered-insights.md --- interactivity/AI-powered-insights.md | 8 ++++---- 1 file changed, 4 insertions(+), 4 deletions(-) diff --git a/interactivity/AI-powered-insights.md b/interactivity/AI-powered-insights.md index 27b6f3348..146296627 100644 --- a/interactivity/AI-powered-insights.md +++ b/interactivity/AI-powered-insights.md @@ -35,10 +35,10 @@ We have four available options for the `friendlyName` setting | | | | ------ | ------ | -|Microsoft.Extensions.AzureAIInference.Core|DefaultProviderInfoFactory.MicrosoftExtensionsAzureAIInference| -|Microsoft.Extensions.AzureOpenAI.Core|DefaultProviderInfoFactory.MicrosoftExtensionsAzureOpenAI| -|Microsoft.Extensions.AzureOllama.Core|DefaultProviderInfoFactory.MicrosoftExtensionsOllama| 
-|Microsoft.Extensions.OpenAI.Core|DefaultProviderInfoFactory.MicrosoftExtensionsOpenAI| +|Microsoft.Extensions.AI.AzureAIInference|"MicrosoftExtensionsAzureAIInference"| +|Microsoft.Extensions.AI.OpenAI + Azure.AI.OpenAI|"MicrosoftExtensionsAzureOpenAI"| +|Microsoft.Extensions.AI.Ollama|"MicrosoftExtensionsOllama"| +|Microsoft.Extensions.AI.OpenAI|"MicrosoftExtensionsOpenAI"| ````JSON "AIClient": { From 46808b8b771a4954d121bd4eb70c9dee238fbff2 Mon Sep 17 00:00:00 2001 From: IvetNikolova <118352332+IvetNikolova@users.noreply.github.com> Date: Fri, 9 May 2025 14:14:25 +0300 Subject: [PATCH 5/8] Updated the Extensibility section --- interactivity/AI-powered-insights.md | 19 ++++++++++++++++--- 1 file changed, 16 insertions(+), 3 deletions(-) diff --git a/interactivity/AI-powered-insights.md b/interactivity/AI-powered-insights.md index 146296627..daf6609a9 100644 --- a/interactivity/AI-powered-insights.md +++ b/interactivity/AI-powered-insights.md @@ -71,8 +71,21 @@ We have four available options for the `friendlyName` setting ```` ## Extensibility -* If the existing clients are insufficient, custom implementations can be developed and added to the REST service configuration +* If necessary, the Reporting engine can use a custom Telerik.Reporting.AI.IClient implementation, which can be registered in the Reporting REST Service configuration: -* Predefined prompts in the REST service controllers can be intercepted and modified +````C# +builder.Services.TryAddSingleton(sp => new ReportServiceConfiguration +{ + HostAppId = "MyApp", + AIClientFactory = GetCustomAIClient, + ... +}); +static Telerik.Reporting.AI.IClient GetCustomAIClient() +{ + return new MyCustomAIClient(...); +} +```` + +* The configured predefined prompts can be modified at runtime by overriding the "UpdateAIPrompts" method of the ReportsController class. -* Certain metadata sent to AI models can be altered through resx files +* Parts of the metadata sent by the Reporting engine to the AI model can be customized through resx files From 3cb40d0e299be0a3cf81b54baa3a888a29c9084d Mon Sep 17 00:00:00 2001 From: IvetNikolova <118352332+IvetNikolova@users.noreply.github.com> Date: Fri, 9 May 2025 16:22:17 +0300 Subject: [PATCH 6/8] Updated snippets --- interactivity/AI-powered-insights.md | 55 ++++++++++++++++------------ 1 file changed, 31 insertions(+), 24 deletions(-) diff --git a/interactivity/AI-powered-insights.md b/interactivity/AI-powered-insights.md index daf6609a9..7dda9a891 100644 --- a/interactivity/AI-powered-insights.md +++ b/interactivity/AI-powered-insights.md @@ -26,7 +26,8 @@ The AI-powered insights in Report Preview provide comprehensive capabilities, in |model|This setting specifies the AI model to be used for generating responses. For example, setting the model to "gpt-4o-mini" indicates that the GPT-4 model variant is being utilized| |endpoint|This setting specifies the URL of the AI service endpoint| |credential|This setting specifies the authentication credentials required to access the AI service. It ensures that the AI client can securely connect to the specified endpoint| -|allowOnlyPredefinedPrompts|This setting is set to false by default. If you set it to `True`, you will not be allowed to ask anything except the predefined prompts. 
For example, if you write "Hi" it will throw an exception| +|requireConsent|A boolean configuration switch that determines whether users must explicitly consent to the use of AI models before the AI report insights features can be utilized within the application| +|allowCustomPrompts|This setting is set to false by default. If you set it to `True`, you will not be allowed to ask anything except the predefined prompts. For example, if you write "Hi" it will throw an exception| |predefinedPrompts|This setting specifies a list of predefined prompts that the AI client can use. Each prompt is defined by a text attribute, which contains the prompt's content| __AI clients__ @@ -41,33 +42,39 @@ We have four available options for the `friendlyName` setting |Microsoft.Extensions.AI.OpenAI|"MicrosoftExtensionsOpenAI"| ````JSON -"AIClient": { - "friendlyName": "MicrosoftExtensionsAzureOpenAI", - "model": "gpt-4o-mini", - "endpoint": "https://ai-explorations.openai.azure.com/", - "credential": "", - "allowOnlyPredefinedPrompts": true, - "predefinedPrompts": [ - { "text": "Prompt 1" }, - { "text": "Prompt 2" } - ] +{ + "telerikReporting": { + "AIClient": { + "friendlyName": "MicrosoftExtensionsAzureOpenAI", + "model": "gpt-4o-mini", + "endpoint": "https://ai-explorations.openai.azure.com/", + "credential": "...", + "requireConsent": false, + "allowCustomPrompts": false, + "predefinedPrompts": [ + { "text": "Prompt 1" }, + { "text": "Prompt 2" } + ] + } + } } ```` ````XML - - - - - - - - + + + + + + + + ```` ## Extensibility From 849536dde0c28875cbf9f7236b2df14dd93b1314 Mon Sep 17 00:00:00 2001 From: IvetNikolova <118352332+IvetNikolova@users.noreply.github.com> Date: Fri, 9 May 2025 17:51:42 +0300 Subject: [PATCH 7/8] Update AI-powered-insights.md --- interactivity/AI-powered-insights.md | 2 -- 1 file changed, 2 deletions(-) diff --git a/interactivity/AI-powered-insights.md b/interactivity/AI-powered-insights.md index 7dda9a891..b34587409 100644 --- a/interactivity/AI-powered-insights.md +++ b/interactivity/AI-powered-insights.md @@ -94,5 +94,3 @@ static Telerik.Reporting.AI.IClient GetCustomAIClient() ```` * The configured predefined prompts can be modified at runtime by overriding the "UpdateAIPrompts" method of the ReportsController class. - -* Parts of the metadata sent by the Reporting engine to the AI model can be customized through resx files From cbd1a7901e1b1b81b7f515108f23dd2d28bf7c11 Mon Sep 17 00:00:00 2001 From: Momchil Zanev Date: Fri, 9 May 2025 18:38:43 +0300 Subject: [PATCH 8/8] chore: added aiclient-element.md --- .../aiclient-element.md | 101 ++++++++++++++++++ 1 file changed, 101 insertions(+) create mode 100644 doc-output/configure-the-report-engine/aiclient-element.md diff --git a/doc-output/configure-the-report-engine/aiclient-element.md b/doc-output/configure-the-report-engine/aiclient-element.md new file mode 100644 index 000000000..d0e91fb36 --- /dev/null +++ b/doc-output/configure-the-report-engine/aiclient-element.md @@ -0,0 +1,101 @@ +--- +title: AIClient Element +page_title: AIClient Element Configuration +description: "Learn how to utilize the AIClient Element to configure the AI model used for GenAI-powered insights during report preview" +slug: telerikreporting/aiclient-element +tags: aiclient,element, ai +published: True +position: 13 +--- + + + +# AIClient Element Overview + +The `AIClient` element specifies the configuration settings for the GenAI-powered insights functionality of Telerik Reporting. 
It is used to connect the Reporting engine to a local or remote LLM, as well as configure the behavior of the built-in Reporting AI capabilities.

## Attributes and Elements

__`AIClient` element__

| | |
| ------ | ------ |
|Attributes| • __friendlyName__ - Required string attribute. Specifies the name that corresponds to the type of AI client to be used. The names of the currently supported AI client types are: `MicrosoftExtensionsAzureAIInference`, `MicrosoftExtensionsAzureOpenAI`, `MicrosoftExtensionsOllama`, and `MicrosoftExtensionsOpenAI`.<br> • __model__ - Required string attribute. Specifies the AI model to be used for generating responses. For example, setting the model to "gpt-4o-mini" indicates that the GPT-4 model variant is being utilized.<br> • __endpoint__ - Optional string attribute. If set, specifies the URL of the AI service endpoint.<br> • __credential__ - Optional string attribute. If set, specifies the authentication credentials used to access the AI service.<br> • __requireConsent__ - Optional boolean attribute _(true by default)_. Determines whether users must explicitly consent to the use of AI services before the AI report insights features can be utilized within the application.<br> • __allowCustomPrompts__ - Optional boolean attribute _(true by default)_. Determines whether users are allowed to freely communicate with the AI model. If the switch is set to false, custom queries are forbidden and only the predefined prompts can be used.|
|Child Elements| • __predefinedPrompts__ - Optional element. Defines a list of predefined prompts that the AI client can use.|
|Parent Element|__Telerik.Reporting__ - Configures all settings that the Telerik Reporting Engine uses.|

__`predefinedPrompts` element__

| | |
| ------ | ------ |
|Attributes|None|
|Child Elements| • __add__ - Optional element. Adds a prompt to the list of predefined prompts.|
|Parent Element|__AIClient__|

__`add` element__

| | |
| ------ | ------ |
|Attributes|__text__ - The text of a predefined AI prompt.|
|Child Elements|None|
|Parent Element|__predefinedPrompts__|

## Example

The following code example demonstrates how to configure the Reporting engine with an Azure OpenAI client that uses the `gpt-4o-mini` model. In addition, the AI functionality is restricted to using only a couple of predefined prompts for summarizing and translating the report.

XML-based configuration file:

````XML
<?xml version="1.0"?>
<configuration>
  <configSections>
    <!-- Register the Telerik.Reporting section (see the note below). -->
    <section name="Telerik.Reporting" type="Telerik.Reporting.Configuration.ReportingConfigurationSection, Telerik.Reporting" />
  </configSections>
  <Telerik.Reporting>
    <AIClient friendlyName="MicrosoftExtensionsAzureOpenAI"
              model="gpt-4o-mini"
              endpoint="https://ai-explorations.openai.azure.com/"
              credential="..."
              requireConsent="true"
              allowCustomPrompts="false">
      <predefinedPrompts>
        <add text="Generate an executive summary of this report." />
        <add text="Translate the document in German." />
      </predefinedPrompts>
    </AIClient>
  </Telerik.Reporting>
  ...
</configuration>
````

JSON-based configuration file:

````JSON
"telerikReporting": {
  "AIClient": {
    "friendlyName": "MicrosoftExtensionsAzureOpenAI",
    "model": "gpt-4o-mini",
    "endpoint": "https://ai-explorations.openai.azure.com/",
    "credential": "...",
    "requireConsent": true,
    "allowCustomPrompts": false,
    "predefinedPrompts": [
      { "text": "Generate an executive summary of this report." },
      { "text": "Translate the document in German." }
    ]
  }
}
````

> When adding the `Telerik.Reporting` section manually, do not forget to register it in the `configSections` element of the configuration file. Failing to do so will result in a [ConfigurationErrorsException](https://learn.microsoft.com/en-us/dotnet/api/system.configuration.configurationerrorsexception?view=dotnet-plat-ext-7.0) with the following text: *Configuration system failed to initialize*.

## See Also