Business-ready
Fortune 500 companies have embraced Semantic Kernel for its adaptability and for the ease with which usage patterns can be monitored. Security-conscious capabilities such as telemetry support and adaptable hooks and filters make it possible to deliver business AI solutions at scale.
Version 1.0+ of Semantic Kernel supports C#, Python, and Java, with a commitment to stability and to avoiding breaking changes. Existing chat-based APIs, including AI-powered chat applications, can be smoothly extended with additional modalities such as voice and video.
Semantic Kernel is built for the future: it connects to cutting-edge AI models and adapts as the technology moves into new territory. When new models appear, you can swap them in without reworking your codebase.
Automating business processes
Semantic Kernel connects prompts to your existing APIs so tasks can be carried out automatically. By describing your code to the AI models, the models can call the functions that fulfill a request: when a request comes in, the model asks for a function, Semantic Kernel acts as the bridge between the model's request and the actual function execution, and the result is handed back to the model. This middleware role is what makes it a key building block of AI-based business automation. The sketch below illustrates this flow.
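As a minimal sketch of this idea, the snippet below wraps a hypothetical order-tracking API in a described function and registers it with the kernel. It assumes the earlier (pre-1.0) Python layout of the SDK (semantic_kernel.skill_definition, kernel.import_skill); decorator names, module paths, and the accepted function signature vary between SDK versions, and the plugin itself is purely illustrative.

```python
import semantic_kernel as sk
from semantic_kernel.skill_definition import sk_function


class OrderPlugin:
    """Hypothetical wrapper around an existing order-tracking API."""

    @sk_function(
        description="Looks up the shipping status of a customer order by its id.",
        name="get_order_status",
    )
    def get_order_status(self, input: str) -> str:
        # "input" carries the order id; a real implementation would call the
        # existing REST endpoint here.
        return f"Order {input} has shipped."


kernel = sk.Kernel()
# The description above is what the model (or a planner) reads when deciding
# which function satisfies a request; the kernel executes the call and returns
# the result to the model.
order_functions = kernel.import_skill(OrderPlugin(), skill_name="orders")
```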
Adaptable and extendable
By turning your existing code into plugins and integrating AI services through ready-made connectors, you maximize your current investment. Semantic Kernel uses OpenAPI specifications, just like Microsoft 365 Copilot, so extensions can easily be shared with both professional and low-code developers in your company.
Components
When building a system with Semantic Kernel, several components can be combined to improve the application's user experience.
Not all of them are required, but knowing them helps in understanding the topic at hand. The essential components that form Semantic Kernel are:
Kernel
As the name suggests, the kernel is the central piece of the SDK.
It acts as the hub where all connectors and plugins are registered and where the settings the application needs are configured.
It also provides logging and telemetry support, which helps in monitoring the application's status and performance and in debugging when necessary. A minimal setup sketch follows.
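Here is a minimal sketch of constructing the kernel and registering an AI service on it, following the earlier Python releases of the SDK (sk.Kernel, add_chat_service); the service id, model name, and API key are placeholders, and constructor argument names differ between SDK versions.

```python
import semantic_kernel as sk
from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion

# The kernel is the single object on which connectors, plugins, and settings
# are registered; the rest of the application works through it.
kernel = sk.Kernel()
kernel.add_chat_service(
    "chat",  # an arbitrary service id used to reference this connector later
    OpenAIChatCompletion("gpt-3.5-turbo", "YOUR_OPENAI_API_KEY"),
)
```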
Memories
Memories are the feature that lets us give context to user queries: they allow a plugin to refer back to earlier conversations with the user and so understand the current query more effectively. There are three ways these past interactions can be integrated (a sketch of the third follows the list):
- Key-value pairs: Data is stored as key-value pairs, much like environment variables are conventionally stored and accessed, which requires an exact one-to-one match between the key and the user's input text.
- Local storage: The information is saved to a file on the local file system. Once the key-value store grows large, it is time to transition to this storage type.
- Semantic in-memory search: Here, information is represented as numerical vectors, known as embeddings, to facilitate the search.
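The snippet below sketches the third option: it registers an embedding service and an in-memory store, saves one piece of information, and searches for it semantically. It follows the earlier Python API of the SDK (register_memory_store, save_information_async, search_async); the collection name, id, and text are illustrative, and method names changed in later versions.

```python
import asyncio

import semantic_kernel as sk
from semantic_kernel.connectors.ai.open_ai import OpenAITextEmbedding

kernel = sk.Kernel()
kernel.add_text_embedding_generation_service(
    "embeddings",
    OpenAITextEmbedding("text-embedding-ada-002", "YOUR_OPENAI_API_KEY"),
)
# VolatileMemoryStore keeps the embeddings in process memory; a database-backed
# connector (Chroma, Qdrant, ...) can be swapped in without touching this code.
kernel.register_memory_store(memory_store=sk.memory.VolatileMemoryStore())


async def main() -> None:
    await kernel.memory.save_information_async(
        "chat-history", id="msg-1", text="The user prefers weekly summary reports."
    )
    results = await kernel.memory.search_async(
        "chat-history", "How often should reports be sent?"
    )
    print(results[0].text)


asyncio.run(main())
```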
Planner
When a user provides input, a planner creates a plan to fulfill the request. The SDK includes several planners to choose from (a usage sketch follows the list):
- SequentialPlanner: Devises a plan that chains functions together, keeping track of their input and output parameters.
- BasicPlanner: A simplified form of the SequentialPlanner that connects functions in a sequence.
- ActionPlanner: Creates a plan focused on a single action for the task at hand.
- StepwisePlanner: Executes each step incrementally, evaluating the outcome before moving on to the next one.
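The following sketch shows how a planner is used, assuming the earlier Python API in which BasicPlanner exposes create_plan_async and execute_plan_async; the goal text is illustrative, and the plugins the plan relies on would have to be imported for execution to succeed.

```python
import asyncio

import semantic_kernel as sk
from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion
from semantic_kernel.planning.basic_planner import BasicPlanner

kernel = sk.Kernel()
kernel.add_chat_service(
    "chat", OpenAIChatCompletion("gpt-3.5-turbo", "YOUR_OPENAI_API_KEY")
)
# ... import the plugins (skills) the plan is allowed to use here ...


async def main() -> None:
    planner = BasicPlanner()
    # The planner asks the model to pick and order functions that satisfy the goal.
    plan = await planner.create_plan_async(
        "Summarize yesterday's sales figures and draft an email to the team.", kernel
    )
    result = await planner.execute_plan_async(plan, kernel)
    print(result)


asyncio.run(main())
```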
Connectors
In Semantic Kernel, connectors act as links between the different components, enabling information to flow smoothly among them.
Connectors can also be written to interact with external systems, for example to call HuggingFace models or to use a database as a memory store.
The Semantic Kernel SDK ships with a set of ready-made connectors that fall into two main categories (a short sketch follows each list):
Integration with AI models:
- HuggingFace: Microsoft.SemanticKernel.Connectors.AI.HuggingFace
- Oobabooga: Microsoft.SemanticKernel.Connectors.AI.Oobabooga
- OpenAI: Microsoft.SemanticKernel.Connectors.AI.OpenAI
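On the Python side, switching the AI connector is a small, localized change. The sketch below assumes the HuggingFaceTextCompletion connector from the earlier Python releases; the model name and task are illustrative, and the class location and signature may differ between versions.

```python
import semantic_kernel as sk
from semantic_kernel.connectors.ai.hugging_face import HuggingFaceTextCompletion

kernel = sk.Kernel()
# Register a locally run HuggingFace model instead of a hosted OpenAI one;
# the rest of the application code stays the same.
kernel.add_text_completion_service(
    "hf", HuggingFaceTextCompletion("gpt2", task="text-generation")
)
```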
Support for existing RDBMS and NoSQL Memories:
- Chroma
- Azure Cognitive Search
- DuckDB
- Kusto
- Postgres
- Milvus
- Qdrant
- Weaviate
- Redis
- SQLite
- Pinecone
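Similarly, a database-backed memory connector can replace the in-memory store. The sketch below assumes a Chroma connector class named ChromaMemoryStore with a persist_directory argument, as in the earlier Python releases; both the class name and constructor arguments are assumptions that may differ by version, and the extra connector package must be installed.

```python
import semantic_kernel as sk
from semantic_kernel.connectors.memory.chroma import ChromaMemoryStore

kernel = sk.Kernel()
# Persist embeddings to a local Chroma database instead of process memory;
# save_information_async / search_async are then used exactly as before.
kernel.register_memory_store(
    memory_store=ChromaMemoryStore(persist_directory="./memory_store")
)
```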
Plugins
In the context of AI services and applications, a plugin is a set of functions, native or semantic, that is exposed for the application and the AI to use.
Writing these functions is not enough on its own: we also need to describe their behavior in detail, stating what they take as input, what they return, and any side effects they may have.
It’s crucial at this point to distinguish between two types of functions:
- Semantic Functions: These interpret what the user asks for and produce replies that read naturally to humans. They rely on connectors (and the AI models behind them) to do their work.
- Native Functions: Written in programming languages such as C#, Python, and Java (support currently experimental), these handle tasks that AI models are not well suited for, such as:
- Performing mathematical calculations
- Saving information to memory
- Calling REST APIs
Planners learn what a function does either from annotations in the code itself or from configuration files. The sketch below shows a semantic function whose description serves exactly that purpose.
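This closing sketch defines a semantic function from a prompt template, assuming the create_semantic_function API of the earlier Python releases; the prompt, description, and parameter values are illustrative, and a chat or completion service must be registered before the function is invoked.

```python
import semantic_kernel as sk
from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion

kernel = sk.Kernel()
kernel.add_chat_service(
    "chat", OpenAIChatCompletion("gpt-3.5-turbo", "YOUR_OPENAI_API_KEY")
)

# A semantic function is defined by a prompt template; its description is what
# planners read when deciding whether this function fits a step in their plan.
summarize = kernel.create_semantic_function(
    "Summarize the following text in one sentence:\n{{$input}}",
    description="Summarizes a block of text in a single sentence.",
    max_tokens=100,
    temperature=0.0,
)

result = summarize("Semantic Kernel bridges prompts, plugins, and AI models ...")
print(result)
```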