Free Tool Arena


How to Use Semantic Kernel

Installing Semantic Kernel in C#/Python/Java, Kernel, plugins, planners, memory, and agent framework.

Updated April 2026 · 6 min read

Semantic Kernel is Microsoft’s open-source SDK for orchestrating LLMs, plugins, and planners in C#, Python, or Java.


Where LangChain optimises for breadth and experimentation, Semantic Kernel targets enterprise apps: strong typing, dependency injection, telemetry, and first-class support for Azure OpenAI. It’s the framework powering much of Microsoft’s own Copilot surface.

What it is

Semantic Kernel exposes a Kernel object that wires together AI services (chat, embeddings, image), plugins (callable functions the model can invoke), memory (vector stores), and planners that turn a goal into a sequence of function calls. It’s available as NuGet, PyPI, and Maven packages with near-parity across languages.

Install / sign up

# Python
pip install semantic-kernel

# .NET
dotnet add package Microsoft.SemanticKernel

# Java
# Add to pom.xml (plus a <version> element for the release you target):
# <dependency>
#   <groupId>com.microsoft.semantic-kernel</groupId>
#   <artifactId>semantickernel-api</artifactId>
# </dependency>

# You'll need an OpenAI or Azure OpenAI API key

First session

Create a Kernel, register a chat service, and add a plugin. The model can then call your plugin functions automatically when it decides they’re relevant.

import asyncio
import os

import semantic_kernel as sk
from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion

async def main():
    kernel = sk.Kernel()
    kernel.add_service(
        OpenAIChatCompletion(ai_model_id="gpt-4o", api_key=os.environ["OPENAI_API_KEY"])
    )
    kernel.add_plugin(parent_directory="./plugins", plugin_name="Weather")

    reply = await kernel.invoke_prompt("What's the weather in Oslo?")
    print(reply)

asyncio.run(main())

Everyday workflows

  1. Wrap existing REST APIs as plugins — the model will call them via function calling when appropriate.
  2. Use the Handlebars or Stepwise planner to decompose complex goals into ordered plugin calls.
  3. Plug in a memory store (Azure AI Search, Qdrant, Redis) for retrieval-augmented chat.

Gotchas and tips

Semantic Kernel leans heavily on dependency injection; in .NET especially, register services on the host builder rather than newing up a Kernel manually — you’ll get proper logging and configuration. Use the OpenTelemetry integration early so you can debug long plugin chains.

Planners can burn tokens quickly; prefer explicit function composition when the workflow is known and reserve planners for open-ended goals. The Python and .NET SDKs occasionally drift — pin versions in production and check release notes for breaking changes in the preview packages.
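
Pinning, per the advice above, is a one-liner — the version shown is a placeholder, not a recommendation; use the release you've actually tested:

```shell
# Pin the SDK to the exact release you've tested (placeholder version):
pip install "semantic-kernel==1.0.0"

# Record the resolved version so CI installs the same build:
pip freeze | grep -i semantic-kernel >> requirements.txt
```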

Who it’s for

Enterprise teams building Copilot-style features, especially on Azure, who want a supported SDK with strong typing and observability instead of a research-grade framework.

