Steve has a Fifth Annual Chat with Charles Lamanna

Images courtesy of Microsoft Designer

I had originally planned to pounce on Charles on the Friday following Ignite, but then OpenAI blew up for a week. So, I held off until a couple of days ago :).

Charles Lamanna is the Corporate Vice President, Business Apps & Platforms at Microsoft. He is the Top-Dog for everything Dynamics 365, Power Platform, and Dataverse related, including Copilots.

We chatted for almost an hour about many things, including the future of Copilots and UIs in general, and much more. Give it a listen, or read the transcript below if you are one of “those” people.

Here’s the transcript:

Charles Lamanna:
Hello, this is Charles.

Steve Mordue:
Hey, Charles. Steve Mordue. How are you?

Charles Lamanna:
Good, good. Great to hear from you, Steve.

Steve Mordue:
It’s been over a year. I’ve left you alone with all these AI copilots and all this stuff that exploded everywhere.

I thought, “Nah, I should wait until things calm down a little bit.” They haven’t calmed down, but I got tired of waiting, so here I am on the phone.

Charles Lamanna:
Sounds good. We just got through Ignite a couple of weeks ago, which was a great event for us. We had a big push with, of course, AI features, and Copilot features galore. Looking forward to catching up.

Steve Mordue:
Yeah. Like 100 announcements were made at that thing. It was just crazy. It’s unfortunate the OpenAI saga stole some of the limelight for a period of time right after Ignite, when normally that next week is just full of people talking about nothing else.

Now, we had a week delay, but now that conversation is ramping back up again, and you guys definitely had an awful lot of stuff to show there. Can I tell you what my favorite thing was that I saw?

Charles Lamanna:
What was it?

Steve Mordue:
Project Sophia.

Charles Lamanna:
Oh, love it.

Steve Mordue:
That looks so cool to me. Maybe you could tell me, and I don’t care about the other people listening, this is just for my sake. Tell me a little bit about that, because I’ve been thinking since you guys started down this path that the concept of a UI as we have known it for decades was going to change into something else.

I couldn’t put my head around what it would be, but it was definitely not going to be the same. Some form where we’re filling in fields and hitting enter, something was going to change. Project Sophia is designed AI-first, as opposed to AI being added to some, let’s call them, legacy applications, because that’s what they all are. What are your thoughts on that as a direction?

Charles Lamanna:
I’m also very excited by Project Sophia, and one of the things we wanted to do was to really take a look at the problems and challenges that users and customers have today, and reimagine them in an AI future. If we go look at Project Sophia, it’s very much focused on helping people explore data, understand data, analyze information and make decisions.

These are focused on the hard decisions that you make in business, not the easy ones, like generating email or generating a document. These are the big questions. How should I price a product? How should I plan my territories? How should I manage my supply chain? How can I improve employee satisfaction? The big, hairy challenges. We took a step back and said, “What would an AI-first approach look like for this type of challenge?

“What would it look like to build a solution that was born in the era of GPT-4? Born in the era of foundation models that can easily understand images and charts, and even dynamically execute code to go create insights?” Sophia was built with that in mind. We reimagined the user experience. We reimagined how data is formatted and ingested, and we reimagined how information is presented and explored by a user.

What we realized, as we went on that journey with Sophia, is it’s not going to be only chat either. Because why just use chat if I can also use a keyboard and I can also use images, and I can also use a mouse to click and point, and drag and drop? Instead of having to have a threaded conversation, what if I could have a visualization of insights and decisions that connect together? It’s the same reason why we all love BI tools, for example.

Who likes looking at raw data as opposed to a bar chart and a line chart? That’s what Sophia is about. It makes it easy to research and solve hard business problems, and it does it in an AI-first way. I think Sophia is the first of many examples where we’re going to go back to the drawing board, and reimagine and rethink common experiences and workflows at work based on this AI revolution.

Steve Mordue:
Yeah. It definitely is an experiment in seeing, “Okay, let’s take a crack at an AI-first and see what we can do, a learning experience to see how that comes together.”

But I can definitely imagine that in the not-too-distant future, Copilots being bolted onto Dynamics 365 Sales will be the old way of doing it. That whole experience will end up getting reengineered. Is that part of the thinking, or at least in the back of your mind, where this thing goes?

Charles Lamanna:
Absolutely, absolutely. For the historian in us, computer science has amazing stories of complete transformation and reimagination of the user experience. If you take mobile devices and smartphones, the successful apps on smartphones are not PC-based applications built for mouse and keyboard, shrunk down with bigger buttons to work on touchscreens on a small screen.

Instead, the most successful apps are those that took a step back and said, “What’s possible now on a mobile device, which was never possible on a PC? Let’s build the best possible user experience for that.” If you take Uber, like Uber would never make sense as an application on a PC, because I’m not on the go. I don’t have a GPS, you don’t know where I am.

You can’t match-make me, but it makes perfect sense on a mobile device because mobile devices have GPS. I have internet on the go, I’m able to go run full-fledged applications standing on the side of the street in New York City to hail an Uber. That type of thinking needs to happen for these AI-based applications as well.

Yes, between say 2002 and 2012, there were a lot of PC or Windows-based applications that were shrunken down and made to fit on a mobile device. But by 2012, those stopped being the main user experiences and you had mobile-first experiences. The same thing’s going to happen with AI.

Steve Mordue:
It’s interesting. There’s a lot of legacy desktop applications that today, I think, work even better on mobile. LinkedIn is a good example.

I prefer the mobile experience to their desktop experience. The team there, are you guys related to their team or are they completely separate teams and folks?

Charles Lamanna:
We’re all part of Microsoft. That’s a different group from the Dynamics and Power Platform teams though.

Steve Mordue:
Yeah, yeah. They’re building their own things, but you guys get to steal from each other’s playbooks I would imagine, right?

Charles Lamanna:
Yes.

Steve Mordue:
If there’s somebody that comes up with a good idea, it gets shared around?

Charles Lamanna:
Yep. That’s one of the things which is incredible about being part of Microsoft during this AI revolution: it’s really a whole-company copilot and AI stack. You have, at the lowest level, GPU infrastructure in Azure and models which are being built in partnership with OpenAI. That’s the same infrastructure and model used by everybody at Microsoft.

We can share in the benefits and also share in the cost to get that stuff built out. But that goes up to the next highest level with things like Azure OpenAI and Cognitive Services, or Azure Machine Learning or Azure AI Studio, or Azure AI Search. We all get to use that. Then you have the mid-tier with things like Copilot Studio and common reusable UI components between Office and Dynamics.

All of that layers up in a way where it can draw from code, from infrastructure, and even just learnings across the entire company.

Steve Mordue:
When the AI race really broke out, I think, was when you guys launched the Bing Copilot. I think that was when everybody became aware that you guys were playing in the sandbox as well, beyond the ML and some of the legacy stuff, but that you were in the chat space as well. Sent Google reeling, I think, for a little bit. We have yet to really see fully what they’re going to come back with, although they’re doing something.

We got Salesforce out there with their Einstein doing their thing. But both of those companies have a little bit of a disadvantage, in that they’re point solutions for the most part. Salesforce is a business application, which you guys have, but that’s all they have. Google is mainly search. They’ve got some productivity stuff that’s out there and some email stuff that’s out there.

But you guys are in a unique spot that you’ve got, I don’t know of a base you don’t have covered. You guys are producing everything across the stack that anybody would need. That just feels to me like a huge advantage going into this next cycle of AI.

Charles Lamanna:
Yeah. I think the comprehensive view of AI at Microsoft, is going to be super critical to making the AI as effective as possible. One of the things that we did at Ignite, was really focus on this idea of a single Microsoft Copilot. We want to make sure that that copilot is accessible to everybody and anybody in both their personal and professional life.

That copilot is able to have understanding and knowledge of past engagements and interactions, but also has all the skills it needs to do things like write emails, create a sales report, work a customer service case, do financial planning. You name it, it’s one copilot that can do all of it. We talk about this idea of wanting there to be a copilot for everyone in the world, much like we used to want a personal computer for everyone in the world.

It’s that same idea. This was an important evolution of the approach to bring all of that together, like you mentioned, because what we heard from customers loud and clear is they don’t want an army of copilots. They don’t want 25 different copilots that they have to organize and manage to get the job done. They want a single copilot, which is able to do everything they need at work and at home.

That’s why we, for example, renamed Sales Copilots to Microsoft Copilot for Sales and why we made the Bing Chat also available at Copilot.Microsoft.com. It’s all about this one copilot that can do everything for you in one place.

Steve Mordue:
That seemed like the next step. I was a little critical when these things first launched, I don’t know if you recall, because it seemed like in the AI arms race, the first pass was: let’s slap a copilot on everything we have, just as quickly as we can. They don’t have to be particularly valuable. We’ll circle back and do another pass once we’ve checked the feature box across our stack that everything has a copilot. And now we’re well beyond that.

Now we’re deep into the, “Okay, now let’s take this copilot that we slapped on here, and turn it into an actual copilot of value.” It’s been pretty impressive to see what’s happened with some of that stuff. But then I was starting to get inundated with, there’s a copilot for this and a copilot for that. I’m thinking, like you said, 25 copilots to manage.

At what point does this just become a copilot that follows me from my productivity role, follows me into my sales app for my sales functions, follows me into the service app, follows me wherever I’m going? It’s the same copilot, maybe even keeping context of what we’ve been doing through this little journey across a few different apps. Is that where we’re going to get?

Charles Lamanna:
Exactly. That’s absolutely where we’re heading towards. When you get there, the whole interaction pattern changes. Like you mentioned earlier, it’s not going to a form and filling out information with a copilot on the side. Instead, there’s a copilot that I can go to and do all the tasks I want, and it has the full context of what I’ve done. We call that idea memory.

If you think about just a salesperson: if I have an engagement with a customer, I want the copilot to have the full knowledge and history of all of our past engagements, whether that’s emails, Teams messages, documents I’ve shared with them and more, because that’s what I have in my mind. It can be an augmentation, an extension of how I approach the customer if it has that same context.

Definitely that’s the direction we’re going. I would say I don’t think we shipped a bunch of copilots just to check boxes. We did it really to learn. It’s interesting. I don’t think I’ve ever been part of a team in an area where we’ve been so far ahead of the broader market. When we launched Copilots, there was nothing like it anywhere.

Steve Mordue:
A rare spot for you guys to have been in.

Charles Lamanna:
Yeah. What that means is we have to write the book. That’s what I tell my team. We are running the experiments and we are writing the book on what AI applications look like. We are writing the book about what Copilot means. No one knows today what the final shape will be. The only way that we actually can figure out the best possible way for presenting AI and copilots, is by shipping early, shipping often, and listening and learning from customers as they adopt these AI capabilities.

My goodness, have we learned a lot. We learned a lot technologically. Even more importantly, we’ve learned a lot from a user experience point of view and a business value point of view. We now have some real wins on the board in terms of business impact using Copilot. We published a great case study back in September about how we use Copilot for technical support at Microsoft. That was massive cost savings and a massively improved customer experience.

Things like 12% to 16% improved throughput of cases per agent from a copilot. That’s game-changing. 15% less time spent on the phone, 15% less time spent getting an answer and resolution for a support case. That’s game-changing for our customers. That’s a V1 nine months into the rollout. Imagine what V2, V3, V4 will be able to do.

Steve Mordue:
It’s an amazing position that you’re in, to be able to actually have this enormous, real-life test bed of your support organization to experiment on, if you will.

To figure out what it does and how it can do better internally, instead of just speculating on how a customer might use this stuff and building to that.

You built it to solve your own problem, which is going to be the same problem everybody else has, just on a larger scale. Very interesting.

Charles Lamanna:
Absolutely.

Steve Mordue:
I’m curious what your thoughts are. Now we’ll go off into La-La Land a little bit: what would AGI bring to this equation, in your imagination? Because right now we’ve got our datasets and it doesn’t necessarily learn or speculate.

It’s guessing the next best word, which it does very well and that sort of stuff. That alone today, is adding lots of value to lots of the jobs that we do. But what do you see happening when and if AGI ever comes about? How does that change the thinking?

Charles Lamanna:
Yeah. I would say AGI is probably not going to be a step-change, but much more a gradual progression towards the capability. I say that because along that progression, there will be a whole lot of innovation and value created, even before we take a step back and say, “We’ve gotten there; we’ve gotten to AGI.” If we go look at already what’s happening today, 10%, 15%, 20% productivity benefit, that is absolutely game-changing.

That’s the same benefit as the PC and internet combined. And that’s just V1, like I said, say GPT-4. Imagine a few more versions of that. I think we’ll see each year for the rest of this decade pretty remarkable improvements in the capability of these models. Things like better planning, better knowledge, better action taking, because that’s one of the big things that we’re working through right now to improve the effectiveness.

Having the agent be able to go execute plugins reliably and efficiently. All of that progression will happen. At some point, we’ll take a step back and say, “You know what? This is probably what we would call AGI,” but I don’t think it’s going to be a single breakthrough moment. I think it’s going to be more like steady progress. A progress that’s born of amazing scientific improvements, new algorithms, new approaches, amazing compute.

Just an incredibly large amount of processing power to train and infer. The third bit is also just more and more data, both synthetic and real-world, to continue to inform these models. I think those three things will work together to continue to drive that steady improvement. It’s, I think, only a matter of time until something that we really, truly call AGI exists. I think we’re all just guessing as to when it will actually happen.
But the big takeaway, I would just say, is even if it doesn’t happen, my goodness, there is an incredible amount of innovation and benefit coming before then.

Steve Mordue:
Well, right here at V1. Even at V1 there is.

Charles Lamanna:
Yes, exactly.

Steve Mordue:
You guys again are uniquely positioned, because one of the huge factors of this whole journey is going to be compute. I don’t think there’s anybody on the planet better positioned for that to be able to build and grow that compute capacity that’s going to be needed. I don’t see how anybody starts today. Even anybody that’s already in the field starts today and catches up with where that is. It’s going to be an interesting time.

That is for sure. What’s the term, Moore’s law? We’re in that phenomenon now. You talk about everything that’s been accomplished, and everybody’s hearing AI, AI, AI all the time, but this started only a year ago. In all the years I’ve been working with you guys, I’ve never seen so much stuff happen so quickly in such a short amount of time. You guys aren’t going to be able to slow down.

The wolves are circling out there looking for an opportunity. You have to stay ahead. You’ve partnered with probably the best opportunity in OpenAI to stay ahead. You’ve got all the ingredients necessary to stay ahead, but you have to stay ahead. We thought we were on a dizzying ride with the Power Platform launch and all the changes. I don’t think we’ve seen anything yet.

Charles Lamanna:
Yeah. The joke on our team is that we had the old pace, and now there’s the AI pace, which is a different level of acceleration and velocity than we’ve had in the past. It’s out of necessity, but also opportunity. To stay competitive, you’re going to have to take advantage of these amazing AI capabilities. That’s us internally using GitHub Copilot, using Copilot in M365, and using Copilot for our sales and service functions inside of Microsoft, the company.

Also, finding ways to make it so our customers can take advantage of the same AI capabilities as fast as possible. But it’s also something that’s been worked towards for a couple of years, and let me expand on that in two important dimensions. The first: if you want to say when this pace really first took shape, it’s when Satya pushed this idea of One Microsoft.

He started to change the culture to one that’s focused on a growth mindset, being open-minded and learning, as opposed to feeling as though we know everything. This mindset has made it possible to have a culture where this type of innovation can bloom so rapidly, where we could do a partnership with a company like OpenAI, and where we could have a single copilot stack for the entire company. That really came from that cultural shift.

That’s item number one. Item number two is I wish I could say that we did everything in the last year, but we did have a little bit of a headstart. We did the partnership with OpenAI first in 2019. We shipped the first Copilot features in Power Apps back in 2021, and the first one in Dynamics 365 in 2022. It really took off in earnest over the last, I’d say, 15 months or so, but we’d been dabbling with it. Our first feature shipped on GPT-3; 3.5 didn’t even exist. I remember in the summer of 2022, a small group of us from Microsoft got a demo of GPT-4 and what would ultimately become ChatGPT. We didn’t think it would be as big as it became that November, but we knew it was something special. Summer of 2022 is when the starting gun really went off for us at Microsoft to start going as fast as possible. I don’t see any signs of it slowing down.

What I would say is we’ll have 100 more Copilot features just in Dynamics and Power Platform and things like Sophia over the next six months, at least 100 just in business applications.

Steve Mordue:
Yeah. I think what happened was it became immediately understandable and accessible to everybody. Prior to that, there was this AI stuff Microsoft has. We’re talking to customers about it. They’re saying, “That sounds really cool,” but they really don’t have any idea what you’re talking about. You’re showing them some demos and some stuff, and they’re not sure how to get there from here.

They couldn’t get a clear picture in their mind of what this is actually going to do for their business, until suddenly they saw ChatGPT, and that just crystallized everything instantly. You guys had been pushing hard before this, AI, AI, AI is the future; you had tons of AI capability and not enough uptake, until the light bulb went on for everybody all at once.

I can imagine the all-hands-on-deck attitude that must have created internally. Thanks, OpenAI, for turning that light bulb on for us. We’ve been waiting.

Charles Lamanna:
Absolutely. I remember early 2022, I would talk to customers and talk about this thing called GPT. They’d be like, “GP what?” I would ask questions, “Who in this room has heard of GPT or GPT-3?” It would be like one in 50 people would raise their hand.

Of course, as of December last year, basically a year ago, you get asked the same question, “Who’s heard of GPT or ChatGPT?” It’d be 100% of people raise their hand, so you’re exactly right.

Steve Mordue:
In only a couple of months’ time, it went from one in 50 to 100%.

Charles Lamanna:
Yes.

Steve Mordue:
That sudden realization by the world was probably one of the fastest shifts in awareness that has ever happened.

Charles Lamanna:
Yes, yep.

Steve Mordue:
Speaking of ChatGPT, I want to make sure that people looking at Microsoft understand these copilots are not just ChatGPT iframed into Microsoft properties.

Because ChatGPT, while it’s a great tool, is known to be a little odd and unreliable, and it maybe hallucinates, but that’s not exactly what you guys are presenting in these copilots.

There’s a lot more happening. Maybe you can unpack a little of that architecture so people can understand the difference.

Charles Lamanna:
Absolutely. We have something we call the Copilot system or the Copilot stack, which is how we’re able to take the raw power of large language models like GPT-4 or like the ChatGPT Turbo or the instruct models. Take that raw power and harness it to answer business questions and help improve experiences in our applications, or even go do things like data exploration in Project Sophia.

The way those systems work is we first use something called retrieval-augmented generation, or RAG, which is the longest name but a great acronym. It allows us to dynamically inject prompts, meta prompts and additional context by reading from, say, a vector database or a search index, or a relational store, or just other information we want to inject, to guide the large language model’s response.

If you use Copilot inside of Dynamics 365, say the chat experience that we have in the Dynamics apps, there are actually thousands of tokens that you don’t see that we inject into every single message to the large language model. All of that is how it can know information about your data and your information, and do that in a secure and compliant way, because we don’t put that data and information in the model itself.

Then, when the response comes back from the large language model, we do things like ranking and filtering, including responsible AI filters, to make sure the answer is the best possible answer we can get from the model. If you think about that large language model at the center, it’s really a little bit of an opaque box. We don’t understand exactly the inner workings, but we can completely control the inputs and outputs to shape and corral it to behave the way we want.
That’s the most common pattern we use: pre-processing and post-processing. We use things like retrieval-augmented generation and rankers on responses to make it really aware of, and able to understand, the business problems we’re trying to tackle. A consequence of that, as you mentioned, is reduced hallucination. It reduces, I’d say, how much it draws from broad internet knowledge, and instead biases towards information specific to the customer, the scenario, the workflow.
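Charles describes the pattern in the abstract; a minimal sketch of what retrieval-augmented generation looks like in code might help. Everything below is an illustrative stand-in: the bag-of-words “embedding,” the in-memory document list, and the prompt template are toys, not the actual Copilot stack or any real Azure API.

```python
import math
import re
from collections import Counter

# Toy "vector store": three documents standing in for tenant business data.
DOCS = [
    "Contoso opportunity: 40 open deals, 12 forecast to close next month.",
    "Support policy: escalate priority-1 cases within one hour.",
    "Northwind account history: renewal signed in March, upsell pending.",
]

def embed(text: str) -> Counter:
    """Crude bag-of-words 'embedding'; a real system uses an embedding model."""
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, k: int = 1) -> list[str]:
    """Rank documents by similarity to the query and keep the top k."""
    q = embed(query)
    ranked = sorted(DOCS, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def build_prompt(query: str) -> str:
    # The retrieved context is injected into the prompt (the invisible
    # tokens Charles mentions), so the model answers from the tenant's
    # data rather than from memorized internet knowledge.
    context = "\n".join(retrieve(query))
    return (f"Context:\n{context}\n\n"
            f"User question: {query}\n"
            f"Answer using only the context.")

prompt = build_prompt("How many deals close next month?")
```

In a production system the assembled prompt would then be sent to the LLM, and the response would pass through the ranking and responsible AI filters before reaching the user; those post-processing stages are omitted here.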

That’s the most common use case. We also do some fine-tuning and we use something called LoRA or Low-Rank Adaptation, to further customize the model itself to our specific use cases. We have found great success using this in certain scenarios. The best example being inside of the Copilot for Power Automate, where we have a specific version of a GPT-based model that is amazing at generating Power Automate workflow definitions.

You give it natural language instructions and it will generate the YAML or JSON to describe a Power Automate workflow. If you use that natural language to create a cloud flow or the authoring experience with Copilot and the new Power Automate designer, you’re using that special model. That’s something that we’re really proud of, because we shipped the first version of that 13, 14 months ago even before ChatGPT.

We’ve been able to, based on that unique OpenAI partnership, really evolve the model to be a super hyper-tuned version of GPT specific to Power Automate. Those are probably the two big patterns, and there is a lot of work that goes on top of the model to make it actually behave the way we want it to.
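For readers unfamiliar with LoRA, here is a minimal sketch of the idea Charles references: the base weight matrix is frozen, and training only touches a small low-rank update, which is why it is so much cheaper than full fine-tuning. The dimensions and matrices below are made-up illustrations, not anything from the actual Power Automate model.

```python
# LoRA freezes the base weights W (d x k) and learns a low-rank update:
#   W' = W + B @ A,  with B (d x r), A (r x k), and r << d, k.

def lora_param_counts(d: int, k: int, r: int) -> tuple[int, int]:
    full = d * k          # parameters touched by full fine-tuning
    lora = r * (d + k)    # trainable parameters in the A and B factors
    return full, lora

# A single hypothetical 4096x4096 projection, adapted with rank 8:
full, lora = lora_param_counts(4096, 4096, 8)
savings = full / lora     # trainable-parameter reduction for this layer

# Tiny worked example of the update itself (pure-Python matmul):
def matmul(X, Y):
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*Y)]
            for row in X]

W = [[1.0, 0.0], [0.0, 1.0]]   # frozen base weights (2x2 identity)
B = [[1.0], [0.0]]             # 2x1 learned factor
A = [[0.5, 0.5]]               # 1x2 learned factor
delta = matmul(B, A)           # rank-1 update to the weights
W_adapted = [[w + d_ for w, d_ in zip(wr, dr)]
             for wr, dr in zip(W, delta)]
```

For the example layer above, the update trains roughly 256x fewer parameters than full fine-tuning, which is what makes per-product specializations like the Power Automate model practical.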

Steve Mordue:
I would have to think that probably has a lot of similarities to the GitHub Copilot.

Charles Lamanna:
Yes, exactly.

Steve Mordue:
Because that’s a fairly specific use case that needs specific information, and I’ve been amazed at how well that works.

But you can tell the difference when you jump into your sales app and you ask the sales Copilot, “How many sales does it look like I might close next month?”

It goes and churns and comes back with an answer. If you type that into ChatGPT, you’re not going to get it. Well, you’ll get an answer.

Charles Lamanna:
Exactly.

Steve Mordue:
You actually will get an answer, but it hasn’t been augmented or ranked or anything. I think it’s important for people to understand that copilots are not ChatGPT. They’re not the same thing. There’s a completely different experience inside of a copilot.

But that also makes it, I would have to think, an even bigger challenge to say, “Okay, now let’s reduce all these copilots to one that follows me around.” Because like you say, you’ve tuned this for that: this for sales, maybe this for service, this for office stuff. To be able to follow me around has to be a pretty big challenge.

Charles Lamanna:
Oh, yes. It’s not too dissimilar from what we all do as people. I’m one person, but if I’m talking to one of my direct reports to do a performance review, I talk to them differently than if I talk to a customer in pre-sales, than if I talk to a customer in post-sales, than if I talk to an architect doing a post-mortem review of a live site incident.

It’s the same person, the same knowledge, the same worldview, but the language I use, the questions I ask and the information I focus on can all be different. If we look to the copilot example, you would want a copilot tailored to Steve and I’d want a copilot tailored to Charles. I would want that copilot to be able to adapt and change its behavior based on what I’m trying to accomplish.

I think the question that raises is: is it one model, or is it multiple models with the ability to orchestrate between them? I think we’re all figuring out what exactly that answer is going to be. But it should manifest as a single copilot with a single memory and a single history view, and access to all the same information that you have.

Steve Mordue:
Like a context aware copilot.

Charles Lamanna:
Yeah, exactly. Yes.

Steve Mordue:
I think the Copilot you’re talking about, that’s the one that’s in preview now with larger customers, like a minimum of 300 seats?

Charles Lamanna:
Yeah, the Microsoft Copilot. Yep.

Steve Mordue:
Yep. Obviously, people are out there whining because their little five-user organization can’t get ahold of this thing.

But I have to assume that you have to start somewhere, so you set some sort of a minimum so you can work out the challenges, but eventually this will be available to any size company, right?

Charles Lamanna:
Absolutely, absolutely. Yeah. I would say that that’s just a timing and sequencing decision, not at all a reflection of the technology or the ability.

It’s just there’s only so many GPUs and so many engagements that we can support at once, and we’re just saying we’re going to focus on enterprise first.

Steve Mordue:
Not being a hardware guy, I heard Satya announcing the development of these new GPUs, I guess. Has Microsoft done that before where they’ve built their own GPU?

Charles Lamanna:
I think this was our first one. It’s certainly, I think, the most compelling one that we’ve ever done.

Steve Mordue:
Yeah. Well, I would have to assume that there’s a supply and demand problem right now out there in that space.

One way to solve that is to add your own, create your own, and then you’re not as dependent on the supply and demand that’s out there.

Charles Lamanna:
Yeah. I would say we’ll build our own, and of course, we have a fantastic partnership with Nvidia, which doesn’t change. In the same session where we announced the GPU, Jensen from Nvidia, their CEO, was on stage with Satya talking about it. I think our worldview is we want to support customer choice throughout.

We want to draw from the best of the best, because there’s a real correlation between the effectiveness of the AI models and the underlying compute capacity. We need bigger, better GPUs to build bigger, better models. We really care about continued innovation there. Customers are always asking for more integrated, more end-to-end solutions in Azure.

It’s the same reason we also announced the ARM processor. We’ve already done a whole bunch of amazing work to, say, improve things like network switches and network interfaces inside of the data center. We’re going to keep doing all of those things, but we’ll never ever waver on our commitment to customer choice, and never ever waver on our desire to push the frontier and innovate on what’s possible in the GPU. More memory, more computations, higher velocity, all of that.

Steve Mordue:
Well, and another unique advantage is you guys have all the levers to adjust and play with. You’re not dependent on other folks for a lot of those levers. You have them all at your disposal.

Charles Lamanna:
Yeah.

Steve Mordue:
There’s been some changes in the organization. You’ve got some new people that have come on, people have changed some roles, people have moved on.

I know you guys do that all the time, but has there been any impact from some of those changes? I know Sangya got a new role.

Charles Lamanna:
Yeah. At a high level, I would say technology organizations are never static, and that means rebalancing investments, rebalancing teams. The AI transformation and the AI speed has been no different when it comes to organization shaping and evolving. We’ve definitely done a bunch of shifts and evolutions to dial up our investment in areas around AI.

That’s one piece. Second piece is, I think, a healthy organization is one which does have some churn. I’d say some of my proudest moments as a manager, are when someone on the team goes on to big, awesome roles as a result of success that they had on the team. That’s something that happens from time to time.

Steve Mordue:
Also getting new faces in with new, fresh ideas too. That’s a give and take there.

Charles Lamanna:
Yeah, I was a new guy to biz apps once upon a time. You remember that, CDS 2.0. Anyone that's been around remembers CDS 1.0 and CDS 2.0. You get bonus points for being a long-time listener, but it's good. It's good.
It's evolution. I am beyond fortunate to have the leadership team that I do. Incredible leaders like Sangya, Ryan Cunningham, Laurie Lampkin, Jeff Comstock, Lily Chen, Gay [inaudible 00:43:18].

Just incredible, incredible folks, which take Dynamics and Power Platform and Copilot to the next level.

Steve Mordue:
Who's the PM on the business apps copilots overall? Is there somebody in particular responsible for that?

Charles Lamanna:
I think my view is Copilot is almost like a web-based UI. Every single application and workload needs to have a Copilot strategy. That’s one part of it. Basically, every team has PMs working on Copilot capabilities and features.

We build on two major, core shared pieces of technology. The first is Copilot Studio. Copilot Studio is the way you build copilots; just like you use Power Apps to build Dynamics, you use Copilot Studio to build your copilots.

Steve Mordue:
So that's not just a rebrand of the Power Virtual Agents builder?

Charles Lamanna:
No, it's a lot more than Power Virtual Agents was.

Steve Mordue:
But it's consumed all of that as well?

Charles Lamanna:
What used to be the Power Virtual Agents feature set is now a subset of the features in Copilot Studio. Copilot Studio is how we build a lot of them. The analogy I make is Power Apps and model-driven apps are to Dynamics 365 as Copilot Studio and the copilots it creates are to the copilots for Dynamics and Power Platform.

That's the analogy we make. Then the second piece is what we call CAPI, which is basically an internal name, but these are the Copilot APIs, a shared set of capabilities which all the teams draw from to do things like model selection, text summarization, and retrieval-augmented generation.

It's built by a guy called Donald Kossmann, a distinguished scientist for us. He has a bunch of great experience from Microsoft Research, so his team builds out that common infrastructure.

Steve Mordue:
We're going to see more changes throughout the products. We talked about Sophia as a trailblazer in figuring out what some of these other things will look like. We're seeing things like, I think, Customer Voice now sunsetting.
I'm sure it has to do with AI or something else coming in to take its place. Are there other areas we should be watching and expecting to see affected by AI sooner than others?

Charles Lamanna:
I would say the area where there’ll be a lot of rapid change, will be all points of integration with Office 365.

Steve Mordue:
Okay. Yeah, that makes sense.

Charles Lamanna:
Because what we’re finding is just the Copilot integration points vastly outperform and provide such a better experience than the classic, embedded form. I think a great example is the Outlook add-in that’s been around Dynamics forever versus the Copilot integration in Outlook. We’re going to have both. You can use both, but the Copilot integration in Outlook is just so much better.

I have an early release of that. I use that every day. It is just amazing. I think Office 365 integration, and the second would be anything having to do with data exploration, data preparation and data reporting will look very different. We talk about a world where if I can just ask a question and get the information I need back, why do you need things like a dashboard anymore?

Why do I need a pipeline dashboard? I can just ask whatever question I want, and I can ask follow-up questions and tweaks. Why does a central team need to create a dashboard? I think data and analytics and reporting visualization will change. Probably the third bit is going to be a lot around Power Platform and low-code development. Low-code development has always been about drag-and-drop and expressions.

We’re now going to add natural language to the mix. Probably the most successful copilot has been the copilot for Power Automate. Just the stats we see, the impact that has, the user feedback, people like using natural language alongside the visual designers. That’s going to really be the main way that people interact with Power Platform going forward.

Steve Mordue:
You think they’d prefer that to actually writing expressions?

Charles Lamanna:
Oh, yeah. Yes, exactly. I've been asking for the ability to write T-SQL very easily on top of Dataverse, to have an experience for it.
So I don't have to use the TDS endpoint, but maybe we won't need that anymore. I'll just be able to write natural language instead, so yeah.

Steve Mordue:
Yeah. Well, it had me wondering about things like Viva Sales. I was thinking when we talked about that back when it launched, I would say, “Okay, that’s the first of many Viva services, Viva, this, that and the other.” But now it feels like copilots are stepping on Viva’s body, not just the feet. It’s stepping on the whole body and that’s just going to get consumed into a copilot, which makes sense.

Now I want to ask you a couple of questions that are more specific to me, and you know what we're doing. I got excited when I saw not the ability to use copilots to build a Power App, but the ability to build a copilot to put into your Power App. Model-driven in particular, but also Canvas, because obviously there's a ton of customers out there of all sizes that have taken the Power Platform and built completely bespoke solutions to major problems.

They’re looking at these copilots thinking, “Wow, that’s cool. I wish I could have one.” Well, we’ve got this capability now and what is the… Please don’t tell me that’s going away. I want to hear that that’s continuing to go. Also, these folks have been able to utilize the Outlook app up to now. If there’s this so much better, cooler Copilot app instead of the Outlook app, will they be in a position to use both of those tools?

Charles Lamanna:
Yeah. The first thing is embedding copilots in your Power App definitely is not going away.

Steve Mordue:
Thank you.

Charles Lamanna:
Because what's incredible is customers are asking me, left, right and center, “I love Copilot. How can I put Copilot in my own line-of-business apps?” My answer right now is, no matter who you are, the easiest way is to build a Power App and use the embedded copilot capability.

It's so much easier. Even if you can write code, this is the fastest way to do it. We think in the future, people won't even use applications that don't have copilots. The analogy I make is imagine if you went to work and your boss said, “Here's the UI for you to do your job,” and it was a green screen terminal. You'd run screaming out of the office on your first day.

Steve Mordue:
Too much work.

Charles Lamanna:
Exactly. You're like, “I'm not doing this. I expect graphical user interfaces. There's a thing called a mouse and keyboard I like.” The same thing will be true for Copilot in no time. People will just not use applications which don't have a copilot, so that's one. The second is yes, absolutely; the plan is for our Copilot extensibility in M365 to be available to third parties and others. We're defining that as we speak.

Because like I said, we're learning and experimenting so much right now. I think we've rewritten Copilot for Sales like three times in the past 18 months as we've evolved our understanding. When the platform is more steady and stable, we absolutely plan on having third-party extensibility there as well. That should be 2024. That's not something that's years away. That's 2024.

Steve Mordue:
Nothing is years away anymore.

Charles Lamanna:
Yes, that’s true. That’s true.

Steve Mordue:
That's going to be a big challenge to solve, because as you said, there's this idea of a copilot that basically follows me around wherever I am at work, whatever it is I'm working on. If I'm in sales, if I'm writing an email or making a PowerPoint, it's the same copilot, or the illusion of the same copilot, passing me off seamlessly to the other copilots, bringing with it the bag of information that it has generated so far to be able to do its job.

Then now we’re going to introduce, “Oh, okay. Now let’s open this up for this wild west of ISVs to stick their stuff in here.” That seems like almost a bigger problem to solve because you have control of everything internally. Bringing in the ISVs sounds like that will be a big one to solve.

Charles Lamanna:
Yeah. It won’t be easy, and we have to be very careful. Otherwise, it’ll end up like Internet Explorer with all those add-ins that people used to add like 20 years ago.

Steve Mordue:
Yeah, exactly.

Charles Lamanna:
We don’t want your copilot to have 30 add-ins, which take up the whole screen. You’re exactly right.

We have to figure out the right platform and the right approval process, and the right opt-in process to make sure we don’t compromise on the magic of Copilot.

Steve Mordue:
Yeah. Well, I've taken up a lot of your time, as I do every year when I talk to you. Just coming off of Ignite, I know we've talked AI, Copilot, Copilot, but what else did you think at Ignite in particular people should be looking at that maybe is getting a little overshadowed by all the Copilot stuff?

Because there's always those pet projects where you know, man, if people knew more about this, this thing would just sing. For some reason or another, people don't get to see it right now. We know why: it's buried by copilots. But what else is out there?

Charles Lamanna:
I would say just two things. The first is we shipped a ton of amazing governance and security improvements to Power Platform, which carry through, of course, to Dynamics 365 as well. I think one of the things we’re really proud of is this idea that an application built on Dataverse in the Power Platform and Dynamics 365 applications, are the most secure applications you can build out of the box.

There's a whole bunch of improvements there. There's a great deep-dive breakout session on that from Ignite as well, worth checking out. The second is in Dynamics 365 Finance, where we have announced a bunch of big enhancements to financial planning and analytics. One of the biggest customer asks for the last few years has been to have native FP&A capabilities in Dynamics 365 Finance, and we have it now.

I'd say go check it out. It's in the box and makes it easy to have a really advanced setup. There's AI, but it's not generative AI, just standard budgeting, planning, and analytics. It's something that people have been asking about for a long time, and the team has done a huge amount of work to close this gap, because it has not been an easy one. That is now all available out there, so those are probably the two takeaways.

Steve Mordue:
That definitely is the harder side of the house to move into new stuff. Salespeople are ready to try anything. They're open, they're game: “Hey, yeah, sure, I'll try this.” But on the ERP side of the house, they're very resistant to doing anything new.

They were the slowest to move to the cloud, the slowest to take up new features. You got to tread a little more slowly there, I guess, than in some of the other spaces.

Charles Lamanna:
Yeah. That’s exactly right.

Steve Mordue:
Very cool, Charles. Thank you very much for taking the time to chat with me. I think it’s probably getting to be about your lunchtime by now.

Charles Lamanna:
I wish. I've got three 25-minute meetings back-to-back right after this.

Steve Mordue:
Well, I guess I caught you at the perfect time then.

Charles Lamanna:
You did. You did. It’s always great to connect with you, Steve.

Steve Mordue:
All right, man. Thanks a lot, Charles.

Charles Lamanna:
Yeah. Yeah, have a good one.

 

Add your thoughts below, just don’t pimp your stuff on my blog :)
