Introduction to AI Insights

This feature is currently in Beta as we work to improve it and integrate new tools into it.
AI-generated insights automatically surface the most relevant information from the communications you have entered in Borealis. This feature lets you quickly review what has been happening recently and easily take action to shape your stakeholder engagement strategy.

AI Insights are available from:

  • Individuals
  • Organisations
  • Communications
  • Engagement plans

Insights Features

Examples of Use Cases

  • You need to get in touch with a stakeholder you have never contacted before, or one you have not been in touch with for months. You can consult the insights to quickly get up to date on the latest communications, their general tone, and the important information they contain.
  • Several communications have been entered into the system in relation to a stakeholder you need to get in touch with, and you need to get a "pulse" on the relationship between your organization and this stakeholder. You can review the insights and assess the situation before reaching out to them.

Additional Information on AI

  • The AI used to generate our insights is designed to adapt to different texts from our customers and assist them with their work. As such, there are no restrictions or suggested parameters to optimize the way insights are generated.
  • At this time, Borealis does not offer the ability to use prompts to generate specific insights. While this feature may be added in the future, you cannot generate insights on specific topics or interact with the AI directly. This includes adding parameters such as minimum or maximum length, word count, or the use of specific words.
  • AI insights are generated by asking GPT-4o mini 3 to 4 questions, using communications and summaries when they are created and each time they are edited. The resulting insight is then stored in the database for future access (the sketch after this list outlines this flow). The GPT models we use are hosted by Azure, and we have an instance in each of our active customer geographies (Canada, Paris, and Australia).
  • It is important to note that we do not use customer data to train our AI models. Instead, we use the GPT APIs to ask the questions, combining static context with the information contained in user communications. The OpenAI model is pre-trained and does not require further training from us.
  • Our new AI features have undergone an in-depth risk review by our security team as well as our in-house lawyer. This was done to ensure that customer data remains secure when using these GPT APIs on Azure.
  • We do not have a direct way to influence the future results of insight generation. However, we have integrated a mechanism to tag AI insights as irrelevant when needed. These tags will be used to improve future usability and reduce the likelihood of irrelevant results.
  • Currently, our AI insights focus mainly on to-dos and negative tones. In the future, we may integrate other types of AI insights.
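
For readers curious about what the flow described above might look like in practice, here is a minimal sketch of generating and storing an insight by sending a fixed set of questions to a GPT-4o mini deployment on Azure. This is not Borealis code: the endpoint, deployment name, questions, and the store_insight() helper are hypothetical assumptions used for illustration only.

    # Illustrative sketch only; Borealis' actual implementation is not public.
    # Assumes an Azure OpenAI deployment of GPT-4o mini and a hypothetical
    # store_insight() database helper.
    import os
    from openai import AzureOpenAI

    client = AzureOpenAI(
        api_key=os.environ["AZURE_OPENAI_API_KEY"],                 # assumed environment variable
        api_version="2024-06-01",
        azure_endpoint="https://example-region.openai.azure.com",   # hypothetical regional instance
    )

    # Static context plus a fixed set of questions, as described above.
    QUESTIONS = [
        "Summarize the latest communications with this stakeholder.",
        "What is the overall tone of these communications?",
        "List any to-dos or follow-up actions mentioned.",
    ]

    def generate_insights(communication_text: str) -> list[str]:
        """Ask each question about the communication and collect the answers."""
        answers = []
        for question in QUESTIONS:
            response = client.chat.completions.create(
                model="gpt-4o-mini",  # name of the Azure deployment (assumed)
                messages=[
                    {"role": "system", "content": "You generate stakeholder engagement insights."},
                    {"role": "user", "content": f"{question}\n\nCommunication:\n{communication_text}"},
                ],
            )
            answers.append(response.choices[0].message.content)
        return answers

    # The generated insights would then be saved for future access, e.g.:
    # for answer in generate_insights(communication.body):
    #     store_insight(communication.id, answer)  # store_insight() is hypothetical

In this sketch, the questions and system message stand in for the static context mentioned above; the answers are what would be stored and shown as insights, and regenerated each time the communication is edited.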