News from the three leading tech conferences distilled into one dispatch.
The year is halfway over, and the major tech conferences on Microsoft 365 (M365) for 2024 have run their course. Marcin Wojciechowski, our M365 Competency Center Leader, attended several of these events. CollabDays, InfoShare, and MS Tech Summit offered insight into recent developments in the tech community. In this firsthand account, Marcin reflects on the popularity and capabilities of Large Language Models (LLMs), shares exciting news about SharePoint Embedded, and discusses the peculiarities of Microsoft Power Platform. Whether you joined the conferences or missed out entirely, here is a breakdown of the essential points.
Let’s face it: AI is everywhere. LLMs are among the most consequential technologies to reach mainstream software development. But the buzz surrounding them is reminiscent of past technological hype stories, like blockchain, big data, and Web3. While those technologies promised revolutionary change, their impact was often limited. Will AI follow the same trajectory, or is this time truly different?
Let’s take a closer look at these stories. While Web3 and blockchain generated immense hype, their real-world applications remained limited. Blockchain was the cornerstone of Web3, but its use cases outside of that were often outshone by existing technologies. LLMs, on the other hand, exhibit far broader potential.
“To my mind, LLM can perform what I would call a few years back a nondeterministic step, meaning it can get structured data from unstructured input. Of course, it can also do much more — the fact that you can talk with a bot is absolutely amazing, but the next steps are even more exciting,” Marcin shares.
Even now, we can use Retrieval Augmented Generation (RAG) to ground language models in our own data. Another important development is Semantic Kernel, an SDK that lets you register methods (plugins) that LLMs can invoke on the fly. Both RAG and Semantic Kernel hint at what may come in the future.
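The registration pattern Semantic Kernel uses can be illustrated with a toy sketch. This is not the Semantic Kernel API itself; it is a minimal stand-in showing the idea: the host registers named, described functions, the model sees their names and descriptions, and the host dispatches whichever one the model asks for. The `get_weather` tool and its data are hypothetical.

```python
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class Tool:
    name: str
    description: str
    func: Callable[..., str]

class ToolRegistry:
    """Toy stand-in for Semantic Kernel's plugin registration: the model
    sees each tool's name and description, then asks the host to invoke one."""
    def __init__(self) -> None:
        self._tools: Dict[str, Tool] = {}

    def register(self, name: str, description: str):
        def decorator(func: Callable[..., str]) -> Callable[..., str]:
            self._tools[name] = Tool(name, description, func)
            return func
        return decorator

    def invoke(self, name: str, **kwargs) -> str:
        # In a real host, the LLM emits a structured call; we dispatch it here.
        return self._tools[name].func(**kwargs)

registry = ToolRegistry()

@registry.register("get_weather", "Return today's weather for a city.")
def get_weather(city: str) -> str:
    return f"Sunny in {city}"  # hypothetical data source

print(registry.invoke("get_weather", city="Gdansk"))
```

The key design point is that the model never executes code directly: it only names a registered method, and the host stays in control of what actually runs.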
This brings us to the next frontier: AI Agents.
“I predict a major focus on their development over the next few months (or even years). While Rabbit is already promoting its r1 and its ‘Large Action Model’ (LAM) with much fanfare, I believe truly production-ready agents, ones that match their lofty claims, are still on the horizon. The most intriguing advancements in the coming months will likely be around implementing effective Action Models for these agents,” he sums up.
So, what exactly is an Action Model? Simply put, it’s the mechanism that allows AI agents to interact with various applications (even via their APIs) using natural language.
You might wonder: what’s the difference between an Agent with an Action Model and an Assistant powered by a RAG (Retrieval-Augmented Generation) LLM? While both can retrieve tasks from a service like Microsoft To Do, the Agent goes further, allowing users to update or create new tasks directly within the conversation. This capability extends to other APIs (both internal and external) and opens up new possibilities for automating actions and integrating AI into everyday workflows.
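The read-versus-write distinction can be sketched with a toy task store. Everything here is hypothetical: the in-memory `tasks` dictionary stands in for a service like Microsoft To Do, and the `intent` dictionary stands in for the structured output an LLM would emit. A RAG assistant would stop at the read-only `list` action; an agent with an Action Model can also create and mutate tasks.

```python
# Toy task store standing in for a service like Microsoft To Do.
tasks = {1: {"title": "Prepare slides", "done": False}}
next_id = 2

def list_tasks():
    return dict(tasks)

def create_task(title: str):
    global next_id
    tasks[next_id] = {"title": title, "done": False}
    next_id += 1
    return next_id - 1

def complete_task(task_id: int):
    tasks[task_id]["done"] = True

# The Action Model's job: map the model's structured intent onto an API call.
ACTIONS = {"list": list_tasks, "create": create_task, "complete": complete_task}

def execute(intent: dict):
    """intent is what the LLM would emit, e.g. {"action": "create", "args": {...}}."""
    return ACTIONS[intent["action"]](**intent.get("args", {}))

# A RAG assistant stops at "list"; an agent can also change state:
new_id = execute({"action": "create", "args": {"title": "Review PR"}})
execute({"action": "complete", "args": {"task_id": new_id}})
print(execute({"action": "list"}))
```

In a real deployment each entry in `ACTIONS` would wrap an authenticated API call, which is exactly why agents raise permission and safety questions that read-only assistants do not.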
One major hurdle to the widespread adoption of AI is cost. LLMs remain expensive, and this may be their most striking similarity with Big Data. The rise of lightweight LLMs and libraries like LLamaSharp offers a glimmer of hope, but the overall price tag remains a barrier.
This challenge may lead to a resurgence of desktop applications. As laptops become more powerful, running AI locally becomes increasingly feasible. This approach shifts the computational burden from the cloud to the client and can potentially reduce costs.
Marcin explains, “Instead of buying expensive OpenAI cloud services, we can distribute an app with local LLM and transform the computation effort to the client side. We can use Semantic Kernel to build Agents, which can communicate with company services. There is also an opportunity to use Semantic Kernel memory to create a local RAG that can be supplemented with, for example, communication with M365 Semantic Index, which is part of M365 Copilot. It is possible to generate keywords out of prompt and call SharePoint Search API even without an M365 Copilot license.”
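The last step Marcin mentions, generating keywords from a prompt and calling the SharePoint Search API, can be sketched as follows. The `/_api/search/query?querytext='...'` endpoint is the SharePoint Search REST API; the keyword-extraction step, which a local LLM would handle in the scenario described, is replaced here by a simple stopword filter, and the tenant name is a placeholder.

```python
from urllib.parse import quote

STOPWORDS = {"the", "a", "an", "for", "of", "to", "in", "about", "me", "find", "show"}

def extract_keywords(prompt: str) -> list[str]:
    # A local LLM would do this step; a stopword filter stands in for it here.
    return [w for w in prompt.lower().split() if w not in STOPWORDS and w.isalpha()]

def sharepoint_search_url(tenant: str, keywords: list[str]) -> str:
    # SharePoint Search REST API: /_api/search/query?querytext='...'
    query = quote(" ".join(keywords))
    return f"https://{tenant}.sharepoint.com/_api/search/query?querytext='{query}'"

kw = extract_keywords("Find the onboarding documents for new hires")
print(sharepoint_search_url("contoso", kw))
```

The request would still need an authenticated context (an app-only or delegated token), but notably no M365 Copilot license, which is the point of the approach.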
SharePoint Embedded and Fluid Framework 2.0 are approaching widespread availability. While SharePoint Embedded is already generally available, Fluid Framework 2.0 will follow suit this summer.
He shares his excitement about this news, “The next part may be a bit of wishful thinking, but SharePoint Embedded and Fluid Framework 2.0 are a piece of magic! I do believe we will see more and more Document Management Systems built with SharePoint Embedded.”
SharePoint Embedded (SPE) has two major advantages: full flexibility over the application’s user experience, and document collaboration built in. Together with Fluid Framework 2.0, it lets developers build applications that leverage SharePoint’s file storage and collaboration capabilities while providing real-time, multi-user editing experiences.
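In SPE, an application owns its documents through Graph-managed storage containers. The sketch below builds (but does not send) the Microsoft Graph request that creates such a container; the endpoint and body shape follow Graph's fileStorage containers API, but verify them against the current Graph documentation before relying on them, and note that the container type ID and token are placeholders.

```python
import json

GRAPH_BASE = "https://graph.microsoft.com/v1.0"

def build_create_container_request(display_name: str,
                                   container_type_id: str,
                                   token: str) -> dict:
    """Build (but don't send) the Graph request that creates an SPE container.
    Endpoint and body shape follow Microsoft Graph's fileStorage containers
    API; check current Graph docs before relying on this."""
    return {
        "method": "POST",
        "url": f"{GRAPH_BASE}/storage/fileStorage/containers",
        "headers": {
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        "body": json.dumps({
            "displayName": display_name,
            # Issued when you register a container type for your app.
            "containerTypeId": container_type_id,
        }),
    }

req = build_create_container_request(
    "Contracts", "00000000-0000-0000-0000-000000000000", "token-placeholder")
print(req["url"])
```

Each container behaves like a dedicated, app-scoped document library, which is what makes the “DMS built on SPE” scenario Marcin describes plausible.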
“Once again, we have all the good things from the M365 container here: full flexibility with UX, as well as scalable co-authoring and coediting with Fluid Framework. All the best components of a great Content Management System (CMS)!” Marcin adds.
Power Platform is still on the rise. The platform’s combination of Power Apps, Power Automate, Power BI, and Power Pages empowers both citizen developers and professional developers to rapidly build and deploy custom applications, automate workflows, create insightful dashboards, and design engaging web pages.
Marcin continues, “Let’s be honest — that’s the best low/no-code platform on the market. Great integration (not only with the M365 stack but also third-party APIs), a rich feature set, AI support, and Copilot Studio; this is a recipe for success. However, I can see the first wave of disappointment in Power Platform.”
For quite some time, Power Platform was promoted as a solution to every problem. Now, we can see it’s not a one-size-fits-all answer. While it excels at rapid application development, workflow automation, and data visualization for many scenarios, it does have its limitations.
“Don’t get me wrong – Power Platform is a great tool for personal/team productivity, but when it comes to enterprise-grade solutions, we may run into some problems. To handle proper scale, we have to go with a Premium license, which can be quite expensive at the enterprise level, so the running cost can be significantly higher than running a web app with a SQL Server backend (even in the cloud),” he explains.
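The cost dynamic behind this point is simple: per-user Premium licensing scales with head count, while a custom web app with a database is roughly flat. A back-of-the-envelope break-even calculation makes it concrete; all figures below are hypothetical placeholders, so check current Microsoft and cloud pricing before drawing conclusions.

```python
def monthly_cost_power_platform(users: int, per_user_license: float) -> float:
    # Premium licensing scales linearly with head count.
    return users * per_user_license

def monthly_cost_web_app(hosting: float, database: float) -> float:
    # A web app + SQL backend is roughly flat regardless of user count.
    return hosting + database

# Hypothetical figures, per month; verify against current pricing.
LICENSE = 20.0      # per user
HOSTING = 150.0     # app service tier
DATABASE = 250.0    # managed SQL tier

breakeven = (HOSTING + DATABASE) / LICENSE  # users at which the costs meet
print(f"Break-even at ~{breakeven:.0f} users")
```

Below the break-even point, licensing is cheap and Power Platform’s speed wins; well above it, the flat-cost custom app starts to look attractive, which matches Marcin’s enterprise-scale caveat.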
On the other hand, Power Platform remains the best place for prototyping and delivers real value at the personal or team level, with plenty of functionality available under a standard license.
Marcin is optimistic about the future, saying, “I think we’ll soon figure out exactly where Power Platform fits best in the enterprise world. We’ll probably see some agreement on what kind of apps and how important they need to be to make Power Platform the right choice.”
As another busy conference season winds down, the pace of news about AI and Microsoft 365 shows no signs of slowing. Our team will continue monitoring the latest M365 developments and provide expert guidance. Contact us if you would like to discuss how to best leverage these advancements within your business.