Microsoft Teams Insider

Is AI the new UC? Enterprise Connect thoughts with Kevin Kieller

April 10, 2024 Tom Arbuthnot

Freshly back from Enterprise Connect, Kevin Kieller, Co-Founder of enableUC, and Microsoft MVP Tom Arbuthnot dive into the current state of AI in UC, its future impact on the industry, and how Microsoft Teams might play a central role for many organisations.

  • Thoughts on Enterprise Connect 2024
  • Understanding AI terminology
  • Importance of user adoption and training in driving the value of AI in UC
  • Role of IT Pros in AI landscape
  • Potential for AI to disrupt the Contact Center space

Thanks to Pure IP, this episode's sponsor, for your continued support of the Empowering.Cloud community.

Transcript

Tom: Welcome back to the Teams Insider Podcast. This week we have Kevin Kieller, co-founder of enableUC. He's a good friend, and he actually joined me for a session at Enterprise Connect. And that's what we're talking about: the kind of post-Enterprise Connect thoughts. AI was very much the theme of the show, and we get into how AI is going to impact the industry.

Is AI the new UC? Then we talk about the role of IT pros in the AI landscape, the terminology, the context, and driving value through adoption and training. And finally, we get into the future of AI: is the suite-based approach the right approach, or is best of breed the right approach? Really hope you enjoy the show.

Thanks to Kevin for taking the time. And also many thanks to Pure IP, the sponsor of this show. Really appreciate their support of everything we're doing at Empowering Cloud. On with the show!

Tom: Hey everybody, welcome back to the pod. I've just jumped on with my good friend, Kevin Kieller, and we're going to do a bit of an Enterprise Connect thoughts episode. It's about a week after the show as we record. So Kevin, I think everybody on the pod knows you, you're a familiar face at Empowering.Cloud, but do you just want to give a little bit of an intro?

Kevin: Sure. Thanks, it's nice to be here, Tom. And Enterprise Connect, yes, it's taken about a week to wind down from that hecticness. I'm the co-founder and lead analyst at enableUC. We mostly focus on the Microsoft ecosystem, helping large enterprises and vendors leverage it.

And then I wear another hat: I also lead the BC Strategies team, which is a group of independent analysts, consultants, and thought leaders. Those are really the two places I spend the majority of my time.

Tom: Awesome. And you've spoken at Enterprise Connect for many years. We were just having a pre-conversation, and I said, let's stop and hit record. We were having a conversation about where the show was and what the themes were this year. You had two talks, I think, and were they both around AI? Was that right?

Kevin: They were. So as far as I can recall, and looking back at my slides, this is my 13th year. And for me, it's always been called Enterprise Connect. Before my time it was called VoiceCon, and I guess it was at places other than the Gaylord Palms, but for me, in those 13 years, it's always been the Gaylord Palms with the alligators and the indoor biosphere.

The first time I actually got outside was like Thursday afternoon after it ended, which is fine. You feel like you're outside, but you're in this giant bubble. But I love the Gaylord Palms because it's like coming back to a familiar home. Yeah, that was definitely good.

Clearly, every person that was there this year flagged the fact that AI, artificial intelligence, was a key theme for the show, and if you didn't have an AI story, that was a real problem, but pretty much every vendor had one. I tallied it up: in 73 of the 171 sessions, the session description included AI. So, not surprisingly, on Monday I did Teams plus Copilot, Microsoft's AI, and then on Wednesday afternoon I did a session that really wasn't a comparison session, although that's how people took it, because I put it in a chart, of course they compare, where we looked at the Zoom AI Companion, Webex AI Assistant, Microsoft Copilot, and Google Gemini for Workspace.

A broader look at all the evolving AI assistants, right? And of course, then I did an analytics piece in your great four-hour deep dive into what Teams service owners need to do.

Tom: Yeah, I appreciate you coming into my session and doing a guest spot. Somehow I signed myself up for a four-hour session, and Kevin and a few others came and rescued me with some breaks. I appreciate that.

Kevin: Yeah, I keep telling people that Tom did a good job. I think it's Huckleberry Finn where he enlists his friends to help paint a fence, I'm not a hundred percent sure, somebody correct me if I'm wrong, but I believe that was the story. And I think you did a good job, because you got a lot of friends and people we've worked with to do parts of it, because talking for four hours, when you realized that was all on you, I think that...

Tom: Yeah, obviously the grand plan was to cover everything the Teams service owner needs to know, so it was great, yeah, great to have your perspective on the analytics in that.

I feel like the obvious conversation is the AI conversation, but zooming out a little bit: Enterprise Connect was the PBX show for the whole industry, then it became the UC show for the whole industry, and along the way Microsoft got in the game with OCS, Lync and Skype for Business. Is AI another pivot? Is this the next big wave of influence on communications, do you think?

Kevin: Yeah, I think it is, because I've had time to reflect on it, and when I look back, Microsoft enters the real-time communication space with OCS, but really it's like OCS R2. And then if you look at, okay, what was the high-value scenario for a lot of organizations?

The entire business case was made on reducing audio conferencing spend, because you could use OCS R2 as your audio conferencing as opposed to paying by the minute. Yeah, exactly. And then it was like, wow, this is working pretty well. And so now, I had a PBX upgrade coming up, but maybe I'll just use this as my PBX.

So I think we've seen that through OCS and Lync and Skype for Business and into Teams. And with that, a lot of Exchange experts and Active Directory and server people were forced to learn a lot about real-time communication. So they learned jitter and packet loss and round-trip time, and the tools and the training associated with that.

And I really do think, having had a few days to think about this, that AI introduces new terminology that IT pros, and really anybody who wants to leverage this inside an organization, need to familiarize themselves with, because the high-value scenarios being added to Microsoft 365, but specifically surfacing in Teams, because that's the hub where work gets done, are really around things like meeting summaries and document summaries, really that summarization piece. And as part of that, I think IT pros need to understand large language models: what is a transformer model? What does it mean to be a probabilistic model?

What is the temperature of a model? What does grounding mean, which is really about the context that gets fed in along with whatever your prompt is? And for that matter, what's a prompt, and what's prompt engineering? Because Microsoft Copilot is fantastic, and we've talked about the number of Copilots, but in the places it's surfaced, even if you look inside of Teams, if you don't understand a little bit about how this works... You don't have to become a data scientist. If you drive a car, you need to know it runs on gas or electricity, and that you have to keep the tires inflated and change the oil occasionally.

So yeah, you don't have to be a mechanic, but you have to know some basics. And I think with leveraging Copilot, if that is your AI assistant, you really do need to understand what it's looking at when it's answering your questions, your prompts, and what it's not looking at, because you could lead yourself astray.

For example, in meeting summaries today, it's only really looking at the transcript from the meeting, and so you do need to understand that, hey, if the speech-to-text didn't work exactly perfectly, then the large language model, Copilot, answering questions is going to have some flaws in it, right?

Or if you don't have...

Tom: It's missing today as it stands, which is a...

Kevin: It's missing chat, and they've said we're going to...

Tom: Yeah, it's coming, isn't it? But certainly in meetings I'm in, IT-led meetings, there's often a sub-conversation happening in the chat: decisions are made, bullets are made, and as it stands, that's missing.

I'm really looking forward to that coming in, because often that's where I cement decisions; I go back to the meeting and look at the chat, actually.

Kevin: No, and the point about chat is this: with the transcript, two things have to go right. You have to first get the speech-to-text right, and then the large language model has to summarize it correctly. With chat, only one thing has to go right, because arguably the person typing the chat is going to type it in themselves, so it's not going to have a speech-to-text issue.

Yeah, hopefully, right? But I think it is important to know that in that meeting, or whether it's a call summary, the call recap they introduced or announced, it's only based today on the transcript. Whereas if you go into the Copilot chat inside of Teams, it is looking at emails and chats and the graph, right?

So it has more context. So in those two different Copilot windows, you're going to get...

Tom: Yeah.

Kevin: You need to understand enough to put your prompt in the right place, but also to understand both the opportunities and the shortcomings, depending on where you enter it.
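To make the grounding point concrete, here is a minimal sketch, assuming a hypothetical call_llm function and made-up context snippets; the idea is simply that the model can only answer from whatever context is packaged with the prompt, whether that's just a meeting transcript or the transcript plus chat, email and graph data.

```python
# Minimal sketch of "grounding": the context packaged alongside the prompt
# determines what the model can actually answer about.
# call_llm() is a hypothetical stand-in for whatever LLM endpoint you use.

def build_grounded_prompt(user_prompt: str, context_sources: dict[str, str]) -> str:
    """Concatenate whatever context is available with the user's prompt."""
    blocks = [f"--- {name} ---\n{content}" for name, content in context_sources.items()]
    return "\n\n".join(blocks + [f"Question: {user_prompt}"])

# In-meeting Copilot today: grounding is only the transcript.
meeting_only = {"transcript": "Tom: Let's aim to ship in May. Kevin: Agreed."}

# Copilot chat in Teams: grounding can span transcript, chat, email, graph data.
broader_context = {
    "transcript": "Tom: Let's aim to ship in May. Kevin: Agreed.",
    "meeting chat": "Kevin: Actually, legal wants a June review first.",
    "email": "Subject: Ship date moved to June pending legal sign-off.",
}

prompt = "What ship date did we agree on?"
# call_llm(build_grounded_prompt(prompt, meeting_only))      # can only see "May"
# call_llm(build_grounded_prompt(prompt, broader_context))   # can also see the change to June
```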

Tom: Yeah, I think you make two really good points there. The first one, that Teams thing, is really important for users. Microsoft have done a good job of landing the Copilot brand, but there are many Copilots, and they're all using that brand. And you're right: if you use what was Business Chat in Teams, you're hitting the kind of Bing chat plus internal data. If you're using the Copilot in a chat, you're hitting just that chat. If you hit the Copilot in a meeting or a call, it's just that isolated transcript. And it's quite possible, I think even likely, that a user is expecting the same experience because they're all Copilot, and this is this AI that's supposed to be able to do magic and do anything,

and that's not the case in those scenarios. But the other one, which I hadn't really thought about, is that you're right: the person who owns Teams in the organization is almost de facto in the driving seat for the AI conversation, because Teams is, as you say, the hub for teamwork. It's where the main Copilot interface surfaces, it's where the best use case today is, the meeting transcripts, and it's backing off the data in SharePoint, so it's the front door for a lot of that conversation, isn't it?

Kevin: Yeah, absolutely. I would argue that today, although different vendors have announced things, Zoom has Ask AI Companion or what have you, really the only AI assistant in any of these UCaaS platforms that does what users expect is when you go into the Teams Copilot chat, because that actually is the only place where, and it literally says it, when I say, hey, help me prepare for this upcoming meeting with Tom,

it'll say: looking at emails, looking at chat, looking at previous meeting transcripts. That's where it's really doing what I think users expect and hope for it to do in all the contexts, but today it's really only in that context that it's using all of that great information to do the contextual grounding.

Tom: Yeah, and that's where Microsoft should have an advantage, because they already have your email, your OneNote, your meetings, and the ability to scale that. I think the other UC platforms could do an absolutely excellent, maybe even better, job of meeting transcription or chat, because they have those. But Microsoft has this advantage of having your data in the other silos and being able to bring it all together.

Kevin: Yeah, I do believe in many respects it's Microsoft's battle to lose. But they're making it interesting because, of course, their licensing model is different. Satya has talked a lot about democratizing technology and democratizing AI, and now you hear Zoom throwing that back and saying, we think everybody should have an AI superpower, which may be because they're including their AI Companion in any paid license. And then, of course, as you talked about, there are lots of Copilots, and there are slightly different results even inside Microsoft 365 Copilot across the different applications.

So I think Microsoft potentially could confuse people enough that they're not able to leverage the value, and this is why I think IT pros need to understand how they help people. There are lots of great user adoption and training materials that Microsoft announced around three weeks ago, so you need to figure out, okay, how do you best use those? And of course, data sovereignty and compliance and who's seeing the data, these are important questions, and different organizations are going to have different levels of requirements in this area. So definitely, this is where IT pros are also going to have to understand how all of this works, because this is complicated stuff. Even with meeting summaries, there are two transcripts, one stored in one place and one in another, and the meeting summaries generated by AI today can't be edited. So you need to help users understand: if I go back and look at a meeting we had a couple of weeks ago and it has some incorrect summary information, how do I deal with that, right?

So lots of new learnings.

Tom: Definitely. That's a really interesting one. Microsoft has kind of positioned Copilot as your assistant, so the theory is you generate the summary, you get it, you understand you generated it and that it's imperfect, which works okay as a framework. But as soon as you're into that model where you're allowed to skip the meeting and go and check the summary out, then you're relying 100 percent on the generated summary, and you don't have the context because you weren't in the meeting.

So to your point, it starts to get interesting: how do users understand that it's an approximation of what happened, that it is imperfect, and how much do you allow yourself to rely on it?

Kevin: Yeah, and I think the exciting thing for all of us is that it's often magic. It does a phenomenal job and it works super well, except when it doesn't. And this is the problem: because it often does such a good job summarising, it just feels magical.

There are some meeting summaries where it's just, wait a second, this has got to be some people in the background, right? Like somebody who just types really fast. I think there were some April Fools' jokes around that, where some of the AI people said, no, it's not really a large language model, it's just that we have people who type really fast. But because it often gets it right, the question is how do you identify the times that it doesn't? And I'm pretty confident that Microsoft will get there. Right now they describe all the flaws as features. So they're saying, this is the AI-generated summary, and you can do some Loop component notes, and those are the human notes, and they're trying to say it's by design that you can't edit the AI summaries, but every other vendor lets you do that. So I just feel like eventually they're going to say, hey, yeah, if there's a big mistake, somebody needs to correct it, and ideally it keeps track of who corrected it, so that you can say, oh yeah, Kevin edited these notes, so effectively now I'm responsible, just as if I'd sent out meeting minutes; if they came from me, I'm on the hook because I'm the person who summarized it, right? But it's exciting times, and I think these summary use cases are going to be high value.

It's going to take us a while to figure out how much value they deliver. I think we saw Nicole Herskowitz share some initial stats from some of the early adopters of Copilot, but a lot of those were more qualitative, right? Like, they did a survey: I believe it improved...

Tom: Yeah, "I don't want it taken away from me." If you give people anything, they'd rather not give it back.

Kevin: Right, why give it back? It's easier to say, yeah, of course I'll keep it. And I think over the next 6 to 12 months we're going to get some quantitative data, and some of the new Copilot dashboard analytics reports help you there. So if you deploy a license, so you spend the 360 dollars, the 30 times 12 because it's an annual commitment, you can go look and say, wait a second, that Kevin guy never did a meeting summary. Either he doesn't know how to do it, or maybe going forward I'm going to take his license and assign it to somebody else in the organization.

Tom: I also think, and you touched on it earlier, there's a massive learning curve on the AI stuff for the end user. Clicking a button to get a meeting summary is fairly easy, right? There's value there for the user, and it's probably good enough most of the time. But doing the more advanced stuff around prompts, suddenly you have to have this pretty high-level, or low-level depending on how you look at it, understanding of how to write a prompt: being very specific about what you want, why you want it, and what you don't want. So jumping from summaries to driving Copilot to get value out of it, I feel that's quite a steep learning curve. There's a lot of work to do, and after you've just spent your 360 dollars per user per year, which I think is a better way to frame it than saying 30 per month, I don't think organizations are necessarily as ready as they should be to spend possibly the same again on adoption and training to drive the value, because they've already spent what they think is a big number. But they're going to have to drive the value.

Kevin: Yeah, and I think if you don't provide adoption and training services, you're really not going to get the return on investment, right? For most people. And Microsoft's trying: they've got a button that says Copilot Labs, and it gives you some ideas for prompts. But for sure every week, if not every day, I talk to people and they will use Copilot in a way that I never thought about, and I learn from that.

So I talked to a senior leader of a large IT group, and she was saying she often, like many of us, has overlapping meetings. One of the things she does that I thought was brilliant is this: she knows there are some people on her large team who have a tendency to get into debates or conflict, sometimes I call those spirited discussions,

so she'll go into a meeting transcript afterwards and ask, was there any conflict in the meeting? If the answer is no, she moves on. If the answer is yes, then she can say, hey, Copilot, create a table for me with the contentious items and the pros and cons for the different points, and then she'll look at it to see if she has to go spend more time and help resolve particular issues.

But I love that idea of just asking, at a high level, hey, was there any conflict? Okay, if not, then let's just move on, and she didn't have to spend her time diving deeper. I would never have thought of that.

Tom: No, you're right. That's a great use case.

Kevin: It's like brilliant.

Tom: Yeah, you're right. As you say, there is the Copilot Labs, but I feel like there's not a super accessible way to say: based on your job role and your use cases, here are some prompts. A lot of it is going to be generic, but a bunch of it will be industry- or company-specific as well.

It'd be great to have an internal library of "here's how I've got value out of Copilot" as well.

Kevin: Yeah, and hopefully some of the Microsoft user adoption and training material helps there. The good thing is they put it out, and they always put it in an editable format, right? So you can add your own, and organizations that are big enough to have a communication and training department can sprinkle in some of their own suggestions for their organization.

Once they, hopefully, experiment, right? So 2024, I think, is going to be a lot of Copilot experimentation. We'll get some data, we'll figure out use cases for our organizations, and that will be good, because at some point you're going to have to deal with the business case: if I've got 10,000 or 100,000 people, how many need Copilot licenses, and could some be served by just Teams Premium and the Intelligent Meeting Recap, which you can't query, but you get some of the value, right?

So I think, for the people making the business cases, this would be a big spend if you've got a significant base of users.
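To put rough numbers on the business case Kevin describes, here's a back-of-the-envelope sketch using only the figures mentioned in the conversation (30 dollars per user per month on an annual commitment, and headcounts of 10,000 or 100,000); it's an illustration, not pricing guidance.

```python
# Rough business-case arithmetic: $30/user/month on an annual commitment
# is $360/user/year, which adds up quickly at enterprise scale.

COPILOT_PER_USER_PER_MONTH = 30  # USD, annual commit

def annual_copilot_spend(users: int, monthly_price: float = COPILOT_PER_USER_PER_MONTH) -> float:
    """Total yearly spend if every one of these users gets a Copilot license."""
    return users * monthly_price * 12

for headcount in (10_000, 100_000):
    print(f"{headcount:,} users -> ${annual_copilot_spend(headcount):,.0f} per year")
# 10,000 users -> $3,600,000 per year; 100,000 users -> $36,000,000 per year
```

That is before any spend on the adoption and training work discussed above, which is part of why the "who actually needs a full license versus Teams Premium" question matters.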

Tom: Definitely. So I think we're both saying that AI will be pretty impactful. Let's fast forward. Actually, you've got the AI show coming from the Enterprise Connect team. Is it Enterprise AI or AI...?

Kevin: I believe they're calling it Enterprise Connect AI, and my understanding is there's not a lot of detail yet, but it's October 1st and 2nd in Santa Clara, and that's the sum total of what I know about it.

Tom: Yeah, because I'm curious what the definition of that show is, because if everything's AI, what does that mean? I guess it zeroes in on the use cases, potentially. But Enterprise Connect this time next year: I feel like in some ways the Microsoft keynote this year was similar to last year's, except last year it was showing you what's coming and this year it was, it's here. That can't be the same keynote again in 12 months. So how do you see the UC and AI story changing over the next 12 months? Or do you?

Kevin: I think an interesting thing I predicted last year was that AI is going to move us back to this discussion: there has long been a debate around whether a suite or best of breed is better. AI drives you to the suite, right? Because, as we've talked about, your AI assistant, companion, friend, whatever it's called, the more data, the broader the base of data, the better job it can do.

And so I think what we're going to see a year from now is that continued push to the suite, and we're seeing that from Zoom and Webex, and of course Microsoft has always had the bundle, although arguably they're now de-bundling the licensing model to maybe maximize revenue.

But I think a year from now we'll have not just the qualitative data, we'll have the quantitative data. And I'm hopeful that the vendors are not going to say "my AI summaries are better than yours", and there was a little bit of that, because I don't know exactly how you test that in a probabilistic model.

What that means is that with these large language models, unlike a conventional computer program, the same input does not give you the same output every time. So how exactly do you test that, and then lock down all the variables, like the speech-to-text, which is impacted by gender and accents and the device in the room and all of those kinds of things?

I'd rather see a focus on: okay, here's some quantitative data on how people are using it, and here are the use cases that are driving some real value in the organization.
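As a toy illustration of the "same input, different output" point, here's a self-contained sketch of temperature-based sampling; it's not how any vendor's summarizer is actually implemented, just a way to see why like-for-like benchmarking of probabilistic models is hard.

```python
import math
import random

def sample_with_temperature(scores: dict[str, float], temperature: float) -> str:
    """Softmax-style sampling: higher temperature flattens the distribution,
    so identical inputs can produce different outputs on different runs."""
    if temperature == 0:
        return max(scores, key=scores.get)  # greedy pick: deterministic
    weights = [math.exp(s / temperature) for s in scores.values()]
    return random.choices(list(scores), weights=weights, k=1)[0]

# Toy candidate answers with fixed scores for the same input.
scores = {"May": 2.0, "June": 1.6, "unclear": 0.5}

print([sample_with_temperature(scores, 0.0) for _ in range(5)])  # same answer every run
print([sample_with_temperature(scores, 1.0) for _ in range(5)])  # answers vary run to run
```

That variability is before you even account for upstream factors like speech-to-text quality, which Kevin notes is itself affected by accents, devices and the room.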

Tom: Yeah, I like that idea, because if people are using it, they're inherently getting value, right? People won't persistently use something that doesn't drive value. So that's a good measure in a world where, as you rightly say, the AI could be behaving differently, or people could be doing drastically different things with it that we hadn't thought of. If they're constantly coming back to it, they're clearly deriving value from it. That's a good perspective.

Kevin: And...

Tom: This is our ask for next year, is it? From the vendors and...

Kevin: Yeah, quantitative data, and come with use cases, right? Now, of course, just real briefly, Enterprise Connect has also had a lot of focus on the contact center, right? And the reason is, quite frankly, that there you do have all the quantitative metrics.

So of course AI is surfacing in the contact center, because for those contact center agents every metric is already tracked, and so I think that's a good early indicator. Now, some of the metrics, like first call resolution percentage and average handle time, don't exactly map to the knowledge worker, but if you're smart, you can look at them and say, okay, which of this quantitative data can I map to my UC knowledge worker? And I think if you do that, you can be ahead of the curve in terms of focusing on use cases that are likely to drive value on the UC side of the house.
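For anyone less familiar with the contact centre metrics Kevin references, here's a minimal sketch of how first call resolution and average handle time are typically computed from call records; the field names are illustrative, not taken from any particular contact centre platform.

```python
from dataclasses import dataclass

@dataclass
class CallRecord:
    handle_seconds: int        # talk + hold + after-call work
    resolved_first_call: bool  # no follow-up contact needed

def first_call_resolution(calls: list[CallRecord]) -> float:
    """Share of calls resolved on the first contact."""
    return sum(c.resolved_first_call for c in calls) / len(calls)

def average_handle_time(calls: list[CallRecord]) -> float:
    """Mean seconds an agent spends per call."""
    return sum(c.handle_seconds for c in calls) / len(calls)

calls = [CallRecord(420, True), CallRecord(610, False), CallRecord(300, True)]
print(f"FCR: {first_call_resolution(calls):.0%}")  # FCR: 67%
print(f"AHT: {average_handle_time(calls):.0f}s")   # AHT: 443s
```

The exercise Kevin suggests is deciding which per-user UC signals, such as summaries generated or follow-up meetings avoided, could play an equivalent role on the knowledge-worker side.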

Tom: Yeah. We should do a separate conversation on the CC side of the house, because I think that's really interesting for AI; that's where AI can have hugely positive and measurable impacts, and it will disrupt very heavily.

Kevin: Absolutely. I share that belief, but that would be a good discussion for another time.

Tom: That's the next show then. Next time we have you on, Kevin, we'll get into that. But thanks for your summary, and I really enjoyed your sessions on AI, particularly the comparison. There aren't many people that have that cross-vendor perspective, so that was really good.

Kevin: Thanks. It was good being on the show and I look forward to meeting up in person whenever our paths cross again.

Tom: Awesome. Thanks, Kevin.